CN113581202B - Method, apparatus, and program product for determining environment information of autonomous vehicle

Info

Publication number
CN113581202B
CN113581202B (application CN202110833215.1A)
Authority
CN
China
Prior art keywords
information
vehicle
environment
cloud server
perception
Prior art date
Legal status
Active
Application number
CN202110833215.1A
Other languages
Chinese (zh)
Other versions
CN113581202A (en
Inventor
尚进
丛炜
张晔
杨小枫
Current Assignee
Guoqi Intelligent Control Beijing Technology Co Ltd
Original Assignee
Guoqi Intelligent Control Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guoqi Intelligent Control Beijing Technology Co Ltd filed Critical Guoqi Intelligent Control Beijing Technology Co Ltd
Priority to CN202110833215.1A
Publication of CN113581202A
Application granted
Publication of CN113581202B

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/04 Traffic conditions
    • B60W40/06 Road conditions
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/45 Pedestrian sidewalk
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 Type
    • B60W2554/4029 Pedestrians

Abstract

The application provides a method, apparatus, and program product for determining environment information of an autonomous vehicle, relating to autonomous driving technology, including: receiving environment perception information sent by a vehicle; fusing the received environment perception information to obtain environment information of the area covered by the edge cloud server; and feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the covered area. In this scheme, environment perception fusion software running on the edge cloud server receives the environment perception information sent by vehicles in the area, fuses it into environment information of the area covered by the edge cloud, and feeds back to each vehicle in the area the environment information outside that vehicle, which the vehicle uses to plan and control its automatic driving. Under suitable conditions, no corresponding sensing equipment needs to be deployed on the roadside, so the investment cost is low, the construction period is short, and updates and upgrades are easy.

Description

Method, apparatus, and program product for determining environment information of autonomous vehicle
Technical Field
The present disclosure relates to autonomous driving technology, and more particularly, to a method, apparatus, and program product for determining environmental information of an autonomous vehicle.
Background
At present, in order to improve the driving efficiency and functional safety of automobiles, many vehicles are equipped with a driver-assistance system. The driver-assistance system plans and controls the automatic driving of the vehicle according to the vehicle's own information and environment perception information.
In the prior art, information about vehicles, pedestrians, and the like is collected by corresponding equipment arranged on the roadside, serving as a source of environment perception information for the automatic driving of vehicles, and is provided to the corresponding vehicles or to cloud software.
However, the existing scheme requires deploying corresponding equipment on the roadside, which entails high investment cost, a long construction period, and difficult updates and upgrades. How to obtain environment perception information in a better way is therefore a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The application provides a method, device, and program product for determining environment information of an autonomous vehicle, aiming to solve the prior-art problems that obtaining environment perception information for the automatic driving of vehicles requires deploying corresponding equipment on the roadside, with high investment cost, a long construction period, and difficult updates and upgrades.
According to a first aspect of the present application, there is provided a method of determining environmental information of an autonomous vehicle, the method being applied to an edge cloud server, comprising: receiving environment perception information sent by a vehicle; performing fusion processing on the received environment perception information to obtain environment information of an area covered by the edge cloud server; and feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the area covered by the edge cloud server.
According to a second aspect of the present application, there is provided a method of acquiring environmental information of an autonomous vehicle, the method being applied to a vehicle, comprising: acquiring and sending environment perception information to an edge cloud server; the environment perception information is used for determining environment information of an area covered by the edge cloud server; receiving environment information outside the vehicle sent by an edge cloud server, wherein the environment information outside the vehicle is acquired from environment information of an area covered by the edge cloud server based on the running information of the vehicle.
According to a third aspect of the present application, there is provided an apparatus for determining environment information of an autonomous vehicle, the apparatus being applied to an edge cloud server, comprising: a receiving unit for receiving environment perception information sent by a vehicle; a fusion unit for fusing the received environment perception information to obtain environment information of the area covered by the edge cloud server; and a sending unit for feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the covered area.
According to a fourth aspect of the present application, there is provided an apparatus for acquiring environmental information of an autonomous vehicle, the apparatus being applied to a vehicle, comprising: the sending unit is used for acquiring and sending environment perception information to the edge cloud server; the environment perception information is used for determining environment information of an area covered by the edge cloud server; the receiving unit is used for receiving environment information outside the vehicle, which is sent by the edge cloud server and is acquired from environment information of an area covered by the edge cloud server based on the running information of the vehicle.
According to a fifth aspect of the present application, there is provided an electronic device comprising a memory and a processor; wherein the memory is used for storing a computer program; the processor is configured to read the computer program stored in the memory, and execute the method for determining environment information of an autonomous vehicle according to the first aspect or the second aspect according to the computer program in the memory.
According to a sixth aspect of the present application, there is provided a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the method of determining environmental information of an autonomous vehicle according to the first or second aspect.
According to a seventh aspect of the present application, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of determining environment information of an autonomous vehicle as described in the first aspect or the second aspect.
According to an eighth aspect of the present application, there is provided a system for obtaining environmental information of an autonomous vehicle, comprising an edge cloud server, a vehicle; the edge cloud server is configured to perform the method of determining environmental information of an autonomous vehicle according to the first aspect, and the vehicle is configured to perform the method of determining environmental information of an autonomous vehicle according to the second aspect.
Methods, apparatus, and program products for determining environment information of an autonomous vehicle are provided, including: receiving environment perception information sent by a vehicle; fusing the received environment perception information to obtain environment information of the area covered by the edge cloud server; and feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the covered area. In this scheme, each vehicle within the coverage of the edge cloud server sends its environment perception information to the server; the server fuses the received information into environment information of the area it covers, and feeds back to each vehicle in the area the environment information outside that vehicle, according to the vehicle's running information. The vehicle then uses the received external environment information to plan and control its automatic driving. No corresponding equipment needs to be deployed on the roadside, so the investment cost is low, the construction period is short, and updates and upgrades are easy.
Drawings
FIG. 1 is a flowchart of a method for determining environment information of an autonomous vehicle according to a first exemplary embodiment of the present application;
FIG. 2 is a flowchart of a method for determining environment information of an autonomous vehicle according to a second exemplary embodiment of the present application;
FIG. 3 is a flowchart of a method for acquiring environment information of an autonomous vehicle according to a third exemplary embodiment of the present application;
FIG. 4 is a flowchart of a method for acquiring environment information of an autonomous vehicle according to a fourth exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of a process for determining environment information of an autonomous vehicle according to an exemplary embodiment of the present application;
FIG. 6 is a block diagram of an apparatus for determining environment information of an autonomous vehicle according to a first exemplary embodiment of the present application;
FIG. 7 is a block diagram of an apparatus for determining environment information of an autonomous vehicle according to a second exemplary embodiment of the present application;
FIG. 8 is a block diagram of an apparatus for acquiring environment information of an autonomous vehicle according to a third exemplary embodiment of the present application;
FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To improve the driving efficiency and functional safety of automobiles, many vehicles are equipped with a driver-assistance system, which plans and controls the automatic driving of the vehicle according to the vehicle's own information and environment perception information. In the prior art, information about vehicles, pedestrians, and the like is collected by corresponding equipment arranged on the roadside, serving as a source of environment perception information for the automatic driving of vehicles, and is provided to the corresponding vehicles or to cloud software.
However, the existing scheme requires deploying corresponding equipment on the roadside, which entails high investment cost, a long construction period, and difficult updates and upgrades. How to obtain environment perception information in a better way is therefore a technical problem that urgently needs to be solved by those skilled in the art.
To solve this technical problem, in the scheme provided by the application, environment perception fusion software running on the edge cloud server receives the environment perception information sent by vehicles in the area, fuses it to obtain environment information of the area covered by the edge cloud server, and feeds back to each vehicle in the area the environment information outside that vehicle, which the vehicle can use to plan and control its automatic driving. Under suitable conditions, no corresponding sensing equipment needs to be deployed on the roadside, and the new environment perception fusion scheme has low investment cost, a short construction period, and easy updates and upgrades.
Fig. 1 is a flowchart illustrating a method for determining environment information of an autonomous vehicle according to a first exemplary embodiment of the present application. The method for determining the environment information of the automatic driving vehicle provided by the embodiment can be applied to an edge cloud server.
As shown in fig. 1, the method for determining environment information of an autonomous vehicle according to the present embodiment includes:
step 101, receiving environment perception information sent by a vehicle.
The method provided by the present application may be executed by an electronic device with computing capability, such as a computer. The electronic device can receive the environment perception information reported by vehicles. It may be, for example, an edge cloud server, which may take the form of a cluster server, a distributed server, or the like; this is not limited here.
The environment perception information may include, among other things, the vehicle's own driving information and the vehicle's external perception information perceived by the vehicle.
In one implementation, when the vehicle is started, sensors arranged on the vehicle acquire information about the environment outside it: for example, a radar collects point cloud data outside the vehicle, and a camera captures pictures of the outside. In one embodiment, the vehicle processes the sensor data to obtain a perception result and reports that result to the edge cloud server; in another embodiment, the vehicle reports the raw sensor data directly to the edge cloud server.
Further, the vehicle may also report its own running information, such as its position, speed, and running direction, to the edge cloud server.
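Taken together, the preceding paragraphs describe a per-vehicle report containing the vehicle's own running information plus its external perception results. A minimal sketch of such a message follows; the field names (`vehicle_id`, `perceived`, and so on) are illustrative assumptions, not a format defined by this application:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class RunningInfo:
    # The vehicle's own running information: position, speed, heading.
    x: float
    y: float
    speed_mps: float
    heading_deg: float

@dataclass
class PerceivedObject:
    # One externally perceived object (another vehicle, a pedestrian, a traffic light).
    kind: str              # e.g. "vehicle", "pedestrian", "traffic_light"
    x: float
    y: float
    speed_mps: float = 0.0

@dataclass
class PerceptionMessage:
    vehicle_id: str
    timestamp: float       # moment at which the perception was made
    running_info: RunningInfo
    perceived: list = field(default_factory=list)

# A vehicle assembles its report; in a real system it would then be
# sent to the edge cloud server over the established channel.
msg = PerceptionMessage(
    vehicle_id="A",
    timestamp=100.0,
    running_info=RunningInfo(x=10.0, y=5.0, speed_mps=12.0, heading_deg=90.0),
    perceived=[PerceivedObject(kind="pedestrian", x=14.0, y=6.0)],
)
payload = asdict(msg)  # plain nested dict, ready for serialization
```

`asdict` recurses through nested dataclasses, so `payload` is a transport-ready dictionary.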
Specifically, after being started, the vehicle may access the edge cloud server covering the area where it is currently located, for example by sending start-up information to that server. On receiving the start-up information, the edge cloud server establishes a communication channel with the vehicle for mutual communication, and over this channel the edge cloud server receives the environment perception information sent by the vehicle.
Specifically, the edge cloud server is a server located in the machine room of a base station, where the base station is one that can guarantee real-time performance, reliability, and bandwidth, such as a 5G base station.
Further, if more than one edge cloud server covers the area where the vehicle is located, each of them may establish a communication channel with the vehicle and receive the environment perception information the vehicle sends.
Further, as the vehicle travels, if it leaves the coverage of one edge cloud server and enters the coverage of the next, the previous edge cloud server stops receiving the vehicle's environment perception information and the next edge cloud server starts receiving it.
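The access and handover behaviour described above can be sketched as follows. This is a simplified illustration that assumes circular coverage areas; the class and method names are hypothetical:

```python
import math

class EdgeCloudServer:
    """Minimal sketch: an edge cloud server that only keeps a channel to
    vehicles inside its (assumed circular) coverage area."""

    def __init__(self, name, center, radius_m):
        self.name = name
        self.center = center
        self.radius_m = radius_m
        self.connected = set()  # vehicle IDs with an open channel

    def covers(self, pos):
        return math.dist(pos, self.center) <= self.radius_m

    def on_vehicle_position(self, vehicle_id, pos):
        # Establish the channel when the vehicle is inside coverage,
        # drop it when the vehicle leaves (handover to the next server).
        if self.covers(pos):
            self.connected.add(vehicle_id)
        else:
            self.connected.discard(vehicle_id)

server_a = EdgeCloudServer("A", center=(0.0, 0.0), radius_m=500.0)
server_b = EdgeCloudServer("B", center=(900.0, 0.0), radius_m=500.0)

for server in (server_a, server_b):
    server.on_vehicle_position("car-1", (100.0, 0.0))  # inside A only
for server in (server_a, server_b):
    server.on_vehicle_position("car-1", (800.0, 0.0))  # now inside B only
```

After the second update, server A has dropped the vehicle and server B holds the channel, mirroring the handover described in the text.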
Step 102, fusing the received environment perception information to obtain environment information of the area covered by the edge cloud server.
The fusion processing means that the edge cloud server performs fusion calculation on the environment perception information reported by each vehicle, and finally obtains the environment information of the area covered by the edge cloud server.
Optionally, because the environment of the area covered by the edge cloud changes in real time, the environment perception information reported by each vehicle for the same moment can be fused together. For example, if vehicle 1 reports information 1 for time t and vehicle 2 reports information 2 for time t, the edge cloud can fuse information 1 and information 2 to obtain the environment information at time t.
The area covered by the edge cloud server is the area of the network service provided by the edge cloud.
The environment information comprises various information such as vehicle information, pedestrian information, traffic light state information and the like in the coverage area of the edge cloud server. In one implementation, the environmental information may also include lane marking information, zebra marking information, and the like.
In the solution provided by the present application, the vehicle distribution in the area covered by the edge cloud server is assumed to be uniform. When the number of vehicles in the area is large enough, the edge cloud server need not receive environment perception information from every vehicle in the area, or need not fuse all the information it receives. Once the environment perception information received from vehicles is sufficient to construct the environment information of the covered area, the edge cloud server stops receiving, or stops fusing, environment perception information sent by other vehicles in the area. If a vehicle whose environment perception information is being used drives out of the coverage area, the edge cloud server again receives environment perception information from other vehicles in the area until the received information is once more sufficient to construct the environment information of the covered area.
Furthermore, depending on the actual environment of the area covered by each edge cloud server, the amount of environment perception information required to construct the environment information of that area differs.
Specifically, the number of vehicles whose environment perception information the edge cloud server uses can be preset according to the actual environment. For example, if the edge cloud server is configured to use environment perception information from at most 50 vehicles, it may filter 50 pieces of information from those received and construct the environment information of the covered area based on them. The filtering may be done according to each vehicle's position, for instance by selecting 50 vehicles uniformly distributed over the coverage area.
If there are enough vehicles in the area, the edge cloud server can discard the environment perception information sent by the 51st and subsequent vehicles without fusing it.
When some of the 50 vehicles leave the area covered by the edge cloud server, the edge cloud server selects other vehicles in the covered area and fuses their environment perception information, ensuring that 50 pieces of environment perception information serve as the raw data for fusion.
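Selecting a capped number of uniformly distributed vehicles might, for instance, be approximated with a spatial grid, as in this sketch; the grid-cell approach and the `cell_m` parameter are assumptions, not prescribed by the application:

```python
def select_uniform(vehicles, max_count, cell_m=100.0):
    """Sketch of filtering at most max_count vehicles spread over the
    coverage area: keep at most one vehicle per grid cell, then cap
    the total. 'vehicles' maps vehicle_id -> (x, y) position."""
    chosen, used_cells = [], set()
    for vid, (x, y) in sorted(vehicles.items()):
        cell = (int(x // cell_m), int(y // cell_m))
        if cell not in used_cells:
            used_cells.add(cell)
            chosen.append(vid)
        if len(chosen) == max_count:
            break
    return chosen

vehicles = {
    "v1": (10.0, 10.0),    # same 100 m cell as v2 -> only one of them kept
    "v2": (20.0, 30.0),
    "v3": (150.0, 10.0),
    "v4": (10.0, 250.0),
}
picked = select_uniform(vehicles, max_count=50)
```

Here `v2` is dropped because its cell is already represented, giving a roughly uniform spatial spread.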
Step 103, feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the area covered by the edge cloud server.
The running information of the vehicle includes information such as its position, running direction, and speed.
The edge cloud server can determine environment information outside the vehicle from the environment information of the area covered by the edge cloud server according to the running information of the vehicle.
The environment information outside the vehicle is the environment information within a certain range of the vehicle other than the vehicle's own running information, including information about other vehicles, pedestrians, and the state of traffic lights, and may further include zebra crossing information, lane marking information, and the like.
Specifically, once the environment perception information received from vehicles is sufficient for the environment information of the area covered by the edge cloud server to be constructed by fusion calculation, the edge cloud server filters out, for each vehicle, the environment information outside that vehicle from the constructed area environment information according to the vehicle's running information, and sends it to the vehicle. The edge cloud server sends the corresponding external environment information to every vehicle within its coverage area.
Further, if there are not enough vehicles in the coverage area, the environment perception information received by the edge cloud server is insufficient to construct the environment information of the covered area by fusion calculation. In this case, the edge cloud does not send any corresponding environment information to the vehicles.
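Filtering the environment information outside a given vehicle from the fused area-wide environment information can be pictured as a range query around the vehicle's position. The sketch below assumes a simple circular range; `range_m` and the object layout are hypothetical:

```python
import math

def environment_outside_vehicle(ego_id, ego_pos, area_objects, range_m=200.0):
    """From the fused environment information of the covered area, keep
    only the objects within range_m of the vehicle, excluding the
    vehicle's own entry; this is what gets fed back to the vehicle."""
    nearby = []
    for obj in area_objects:
        if obj["id"] == ego_id:
            continue  # the vehicle already knows its own running information
        if math.dist(obj["pos"], ego_pos) <= range_m:
            nearby.append(obj)
    return nearby

# Fused environment information of the whole covered area (illustrative).
area = [
    {"id": "ego",   "pos": (0.0, 0.0),   "kind": "vehicle"},
    {"id": "ped-1", "pos": (50.0, 20.0), "kind": "pedestrian"},
    {"id": "car-9", "pos": (900.0, 0.0), "kind": "vehicle"},  # too far away
]
feedback = environment_outside_vehicle("ego", (0.0, 0.0), area)
```

Only the nearby pedestrian is fed back; the distant vehicle and the ego vehicle's own entry are excluded.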
Methods, apparatus, and program products for determining environment information of an autonomous vehicle are provided, including: receiving environment perception information sent by a vehicle; fusing the received environment perception information to obtain environment information of the area covered by the edge cloud server; and feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the covered area. In the scheme provided by the application, each vehicle within the coverage of the edge cloud server sends its environment perception information to the server; the server fuses the received information into environment information of the area it covers, and feeds back to each vehicle in the area the environment information outside that vehicle, according to the vehicle's running information. The vehicle then uses the received external environment information to plan and control its automatic driving. No corresponding equipment needs to be deployed on the roadside, so the investment cost is low, the construction period is short, and updates and upgrades are easy.
Fig. 2 is a flowchart illustrating a method for determining environment information of an autonomous vehicle according to a second exemplary embodiment of the present application. The method for determining environment information of the automatic driving vehicle provided by the embodiment can be applied to an edge cloud server.
As shown in fig. 2, the method for determining environment information of an autonomous vehicle according to the present embodiment includes:
step 201, receiving environment perception information sent by a vehicle.
Step 201 is similar to step 101 in implementation manner and principle, and is not described again.
Step 202, obtaining the running information of the vehicle included in the environment perception information, and correcting it according to the delay with which the environment perception information was received, to obtain the corrected running information of the vehicle.
The environment perception information sent by the vehicle and acquired by the edge cloud server comprises the running information of the vehicle and the external perception information of the vehicle. The vehicle running information may include position information, running direction information, and speed information of the vehicle.
When the vehicle sends environment perception information, there is a delay before the edge cloud receives it, and this delay information can be preset for the edge cloud server. According to the delay information, the edge cloud server corrects the vehicle's running information by calculation to obtain the corrected running information, that is, the vehicle's actual running information at the moment the edge cloud server receives the environment perception information. For example, the vehicle's position may be corrected based on its speed and the delay information.
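The position correction from speed and delay amounts to dead reckoning over the known delay. A minimal sketch, under the assumption of straight-line motion at constant speed during the delay:

```python
import math

def correct_position(x, y, speed_mps, heading_deg, delay_s):
    """Dead-reckon the vehicle's reported position forward by the known
    transmission delay, so the edge cloud server works with the position
    at the time of reception rather than the time of sending."""
    heading = math.radians(heading_deg)
    return (x + speed_mps * delay_s * math.cos(heading),
            y + speed_mps * delay_s * math.sin(heading))

# A vehicle travelling east (heading 0 degrees) at 20 m/s with a 0.1 s
# delay has moved 2 m further east by the time its message arrives.
corrected = correct_position(x=100.0, y=50.0, speed_mps=20.0,
                             heading_deg=0.0, delay_s=0.1)
```

A real system would also account for heading changes during the delay; this sketch ignores them.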
Step 203, obtaining external perception information included in each environment perception information, and determining the correction perception information of each object in the environment according to each external perception information.
The environment perception information sent by the vehicle and acquired by the edge cloud server comprises the running information of the vehicle and the external perception information of the vehicle. The external perception information may include the information the vehicle can perceive about other surrounding vehicles, pedestrians, and the state of traffic lights, and in one implementation may also include zebra crossing information, lane marking information, and the like. The information about other vehicles includes their position, running direction, and speed. The pedestrian information includes the pedestrian's position, direction, and speed. The state information of the traffic lights refers to the indication the traffic lights currently show.
Specifically, the vehicle may acquire external perception information of the vehicle through an electronic device such as a camera or a radar provided in the vehicle.
Wherein, each vehicle, each pedestrian or each traffic light in the external perception information can be taken as an object. In one implementation, each zebra crossing or each lane marking may also be an object.
The external perception information in the environment perception information received by the edge cloud server from a vehicle may contain errors. For example, if a sensor provided in the vehicle has an error, the external perception information acquired by that sensor may carry the same error. The edge cloud server can fuse the external perception information in the environment perception information received from each vehicle, and finally determine the correction perception information of each object through fusion calculation.
In an implementation manner, if the plurality of external perception information includes perception information of the same object, the correction perception information of the object is determined according to the plurality of perception information of the object. For example, if the external sensing information reported by the vehicle a at the time t includes the information of the vehicle C, and the external sensing information reported by the vehicle B at the time t also includes the information of the vehicle C, the two pieces of information may be processed to obtain the corrected sensing information of the vehicle C.
Moreover, an object may have a plurality of different types of perception information. For example, the perception information of another vehicle may include position perception information, driving direction perception information, and speed perception information of that vehicle; the perception information of a pedestrian may include position perception information, direction perception information, and speed perception information of that pedestrian.

Further, when determining the correction perception information of an object according to a plurality of pieces of its perception information, the correction perception information for each type may be determined from the pieces of perception information of that same type.
For example, the pedestrian a is an object, the plurality of pieces of external perception information received by the edge cloud server all include information of the pedestrian a, and each piece of information of the pedestrian a includes position perception information, so that the correction perception information of the position of the pedestrian a can be determined through the plurality of pieces of position perception information of the pedestrian a. Similarly, the correction perception information of the direction of the pedestrian A and the correction perception information of the speed of the pedestrian A can be determined. The correction perception information of the pedestrian A can be formed by the correction perception information of the position, the direction and the speed of the pedestrian A.
Specifically, the edge cloud server may determine the correction perception information of the object according to all the received perception information of the same object in the external perception information.
In one implementation, a comparison result is obtained by comparing the plurality of pieces of perception information of the object.
Specifically, the edge cloud server may compare the same type of sensing information in all the sensing information of the object to obtain a comparison result, where the comparison result may be, for example, a difference between any two sensing information, or a difference between each sensing information and a mean value of the sensing information.
In one case, if the comparison result indicates that the difference between the pieces of perception information is smaller than the preset difference value, the average value of the pieces of perception information is used as the corrected perception information of the object.
The preset difference value is set in advance according to actual conditions. For example, for the other-vehicle information, separate difference values may be preset for the position information, the traveling direction information, and the speed information; for the pedestrian information, difference values may be preset for the position information, the direction information, and the speed information; and for the state information of traffic lights, a difference value may be preset for the indication information.
If every pairwise comparison of the same type of perception information of the object yields a difference smaller than the preset difference value, the edge cloud server takes the average of that type of perception information as the correction perception information of that type for the object. The correction perception information of the several types together constitutes the correction perception information of the object.
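The pairwise-comparison rule above can be sketched as follows. This is an illustrative sketch under the assumption that the perception values of one type are plain numbers; `None` here simply signals that the values disagree too much to be averaged directly, so further handling (such as outlier rejection) is needed.

```python
from itertools import combinations

def fuse_attribute(values, preset_diff):
    """Average same-type perception values when all pairwise
    differences are below the preset difference value; otherwise
    return None to indicate the values cannot be fused directly."""
    if all(abs(a - b) < preset_diff for a, b in combinations(values, 2)):
        return sum(values) / len(values)
    return None
```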
The correction perception information of an object characterizes the actual state of the object at the current time. For example, if an object in the coverage area of the edge cloud server is a pedestrian A, and the fusion calculation over all the perception information of pedestrian A yields correction perception information indicating that pedestrian A is at position B, facing south, and moving at 3 m/s, then it can be considered that at the current moment pedestrian A is moving south at 3 m/s at position B.
If the comparison result indicates that abnormal perception information exists, that is, perception information whose difference from at least two other pieces of perception information is larger than the preset difference value, then the average of the remaining perception information, excluding the abnormal perception information, is taken as the correction perception information of the object.
Specifically, for example, if an object within the coverage area of the edge cloud server is a pedestrian A, and the difference between one piece of position perception information of pedestrian A and two or more other pieces of position perception information of pedestrian A is greater than the preset position difference value, that piece of position perception information is regarded as abnormal perception information.

After all abnormal perception information is removed from the position perception information of pedestrian A, the average of the remaining position perception information is taken as the correction perception information for pedestrian A's position. The correction perception information for pedestrian A's direction and speed is obtained in the same way. The correction perception information for position, direction, and speed together constitutes the correction perception information of pedestrian A.
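The outlier-rejection rule described above — discard any value that differs from at least two other values by more than the preset difference value, then average the rest — can be sketched like this. Names are illustrative, and the values are again assumed to be plain numbers of one perception type.

```python
def fuse_with_outlier_rejection(values, preset_diff):
    """Remove abnormal perception values (those differing from at
    least two other values by more than preset_diff), then return
    the mean of the remaining values."""
    abnormal = {
        i for i, v in enumerate(values)
        if sum(1 for j, w in enumerate(values)
               if j != i and abs(v - w) > preset_diff) >= 2
    }
    kept = [v for i, v in enumerate(values) if i not in abnormal]
    return sum(kept) / len(kept)
```

With values `[10.0, 10.2, 10.1, 50.0]` and a preset difference of 1.0, the reading 50.0 differs from all three others by more than 1.0 and is discarded, so the fused result is the mean of the remaining three.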
The execution timing between step 202 and step 203 is not limited.
And 204, determining the environmental information of the area covered by the edge cloud server according to the corrected running information of the vehicle and the corrected perception information of the objects.
Specifically, the objects in the area covered by the edge cloud server may include all vehicles, pedestrians, traffic lights, and lane lines, zebra stripes, and the like.
For each vehicle that reports its own running information, the corrected own-vehicle running information is taken as the standard; for vehicles whose running information has not been corrected in this way, the correction perception information obtained from other vehicles is used instead. In one implementation manner, the corrected own-vehicle running information, the correction perception information of other vehicles, the correction perception information of pedestrians, and the correction perception information of traffic light state information jointly form the environment information of the area covered by the edge cloud server.
The edge cloud server can determine the position of each object in the environment from the corrected own-vehicle running information and the correction perception information of each object, and thereby stitch the objects together to obtain their relative positional relationships within the area covered by the edge cloud server. It can likewise assign parameters such as speed and direction to each object from the same corrected information, forming the environment information of the area covered by the edge cloud server.
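As a minimal sketch of the assembly step, the area-wide environment can be modelled as a map from object identifier to corrected state, where an object's own report (when available) overrides the state fused from other vehicles' perception. The identifiers and state layout are assumptions for illustration.

```python
def build_environment(corrected_vehicles, corrected_objects):
    """Merge corrected own-vehicle running information and corrected
    perception information into one object-id -> state map for the
    covered area. Own-vehicle reports take precedence over states
    perceived by other vehicles."""
    environment = dict(corrected_objects)
    environment.update(corrected_vehicles)
    return environment
```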
Step 205, obtaining environmental information within a preset distance from the vehicle position from the environmental information of the area covered by the edge cloud server, and sending the obtained environmental information to the vehicle.
The preset distance is preset according to actual conditions.
Specifically, the environment information sent by the edge cloud server to a vehicle is the environment information within a preset distance of that vehicle's position. It can also be regarded as external perception information for that vehicle, since it does not include the vehicle's own running information.
Further, the edge cloud server may obtain environment information outside the vehicle according to the position of the vehicle and also according to the road environment of the position of the vehicle.
In one implementation, for example, if the vehicle is traveling in an urban area and an isolation zone exists between vehicles traveling in different directions, the environment information sent by the edge cloud server to the vehicle includes information of other vehicles in the same direction within 500 meters of the vehicle, pedestrian information within 30 meters of the vehicle, status information of traffic lights within 500 meters ahead of the vehicle, zebra crossing information within 30 meters ahead of the vehicle, and lane information within 500 meters of the vehicle. If the vehicle is running on a highway and isolation zones exist between vehicles traveling in different directions, the environment information sent by the edge cloud server to the vehicle may include information of other vehicles in the same direction within 1000 meters of the vehicle. If there is no isolation zone between vehicles traveling in different directions, the other-vehicle information sent to the vehicle by the edge cloud server also includes information of vehicles traveling in the direction opposite to the vehicle. The other-vehicle information may include position information, driving direction information, and speed information of the other vehicles. The pedestrian information may include position information, direction information, and speed information of the pedestrians.
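The per-category distance cutoffs in the urban example above can be sketched as a simple filter over the assembled environment map. The category labels, state layout, and the use of straight-line distance are assumptions for illustration; a real implementation would also distinguish travel direction and ahead/behind as described.

```python
import math

def filter_environment(environment, ego_x, ego_y, ranges):
    """Keep objects whose straight-line distance from the ego vehicle
    is within the preset range for their category, e.g.
    {"vehicle": 500.0, "pedestrian": 30.0} in the urban example."""
    kept = {}
    for obj_id, state in environment.items():
        limit = ranges.get(state["category"])
        if limit is None:
            continue  # category not relevant for this vehicle
        if math.hypot(state["x"] - ego_x, state["y"] - ego_y) <= limit:
            kept[obj_id] = state
    return kept
```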
Fig. 3 is a flowchart illustrating a method for acquiring environment information of an autonomous vehicle according to a third exemplary embodiment of the present application. The method for acquiring environment information of an autonomous vehicle provided by the embodiment can be applied to vehicles.
As shown in fig. 3, the method for acquiring environment information of an autonomous vehicle according to the present embodiment includes:
step 301, acquiring and sending environment perception information to an edge cloud server; the environment perception information is used for determining environment information of an area covered by the edge cloud server.
In an implementation manner, when the vehicle is started, a sensor arranged on the vehicle can acquire environment information outside the vehicle, for example, point cloud data outside the vehicle is acquired through a radar, and for example, a picture outside the vehicle can be shot through a camera. In one embodiment, the vehicle may process the sensor data to obtain a sensing result, and report the sensing result to the edge cloud server, and in another embodiment, the vehicle may directly report the sensor data to the edge cloud server.
Specifically, the vehicle may access the edge cloud server; in particular, after the vehicle is started, it may access the edge cloud server covering the area where the vehicle is currently located. For example, the vehicle may send startup information to the edge cloud server. After the edge cloud server covering the area where the vehicle is located receives the startup information sent by the vehicle, a communication channel can be established between the edge cloud server and the vehicle for mutual communication. Through this channel, the vehicle may send environment perception information to the edge cloud server. Specifically, the communication channel with the vehicle can be established by the edge cloud server.
Further, if there is more than one edge cloud server covering the area where the vehicle is located, the vehicle may establish a communication channel with all the edge cloud servers covering the area where the vehicle is located, and send the environment awareness information to all the edge cloud servers covering the area where the vehicle is located.
Further, with the running of the vehicle, if the vehicle leaves the coverage area of the previous edge cloud server and enters the coverage area of the next edge cloud server, the vehicle only sends the environmental awareness information to the edge cloud server currently covering the area where the vehicle is located.
Specifically, the edge cloud server may determine the environmental information of the area covered by the edge cloud server by using the environmental awareness information sent by the vehicle.
Step 302, receiving environment information outside the vehicle sent by the edge cloud server, wherein the environment information outside the vehicle is acquired from the environment information of the area covered by the edge cloud server based on the running information of the vehicle.
The vehicle running information of the vehicle may include position information, running direction information, and speed information of the vehicle.
The edge cloud server can determine the environment information outside the vehicle from the environment information of the area it covers, according to the vehicle's running information. The vehicle may then receive the environment information outside the vehicle sent by the edge cloud server.
The environment information outside the vehicle is the environment information within a certain range of the vehicle, excluding the vehicle's own running information, and includes information of other vehicles, pedestrian information, state information of traffic lights, and the like; it may further include information such as zebra crossing information and lane line information.
Specifically, if enough environment perception information is sent to the edge cloud server by the vehicles in the area covered by the edge cloud server, the edge cloud server can construct the environment information of the area covered by the edge cloud server through fusion computing. All vehicles within the coverage range of the edge cloud server can receive the environment information which is sent by the edge cloud server and corresponds to the vehicles. The vehicle may schedule and control operation of the vehicle based on the received environmental information corresponding to the vehicle.
Further, if the environment perception information sent to the edge cloud server by the vehicles in its coverage area is insufficient, the edge cloud server cannot construct the environment information of the area through fusion computing, and the vehicles within the coverage area will not receive environment information corresponding to them. In that case, each vehicle is planned and controlled to perform degraded running, that is, to switch to a manual driving mode.
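The fallback decision above can be sketched as a freshness check on the received environment information. The timeout, function name, and timestamp convention are illustrative assumptions; the patent only specifies that a vehicle without environment information from the edge cloud server degrades to manual driving.

```python
def plan_driving_mode(env_info, timeout_s, last_received_s, now_s):
    """Return 'autonomous' while fresh environment information from the
    edge cloud server is available, otherwise 'manual' (degraded
    running, as described in the text)."""
    if env_info is not None and now_s - last_received_s <= timeout_s:
        return "autonomous"
    return "manual"
```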
Fig. 4 is a flowchart illustrating a method for acquiring environment information of an autonomous vehicle according to a fourth exemplary embodiment of the present application. The method for acquiring the environment information of the automatic driving vehicle provided by the embodiment can be applied to vehicles.
As shown in fig. 4, the method for acquiring environment information of an autonomous vehicle according to the present embodiment includes:
step 401, acquiring and sending environment perception information to an edge cloud server; the environment perception information is used for determining environment information of an area covered by the edge cloud server; the environment perception information comprises the running information of the self vehicle and the external perception information.
Step 401 is the same as the manner of obtaining the environment awareness information and sending the environment awareness information to the edge cloud server in step 301, and is not repeated.
In this embodiment, the environment sensing information may specifically include the driving information of the vehicle and the external sensing information.
The environment perception information sent by the vehicle to the edge cloud server comprises the running information of the vehicle and the external perception information of the vehicle, which can be perceived by the vehicle.
The external perception information may include other-vehicle information, pedestrian information, and state information of traffic lights around the vehicle that the vehicle can perceive, and in an implementation manner may also include zebra crossing information, lane line information, and the like. The other-vehicle information includes position information, traveling direction information, and speed information of the other vehicles. The pedestrian information includes position information, direction information, and speed information of the pedestrians. The state information of the traffic lights refers to the indication information of the traffic lights.
Specifically, the vehicle may acquire external perception information of the vehicle through an electronic device such as a camera or a radar provided in the vehicle.
Further, the vehicle may also report the vehicle running information, such as the position, running direction, speed, and the like of the vehicle, to the edge cloud server.
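One possible shape of a single environment-perception report, combining the own-vehicle running information with the external perception information described above, is sketched below. All field names and values are hypothetical; the patent does not specify a message format.

```python
# Hypothetical environment-perception report sent from a vehicle to
# the edge cloud server: own-vehicle running information plus the
# externally perceived vehicles, pedestrians, and traffic lights.
report = {
    "ego": {"position": (116.30, 39.98), "heading_deg": 90.0, "speed_mps": 12.0},
    "external": {
        "vehicles": [
            {"position": (116.31, 39.98), "heading_deg": 90.0, "speed_mps": 11.0},
        ],
        "pedestrians": [
            {"position": (116.30, 39.99), "heading_deg": 180.0, "speed_mps": 1.2},
        ],
        "traffic_lights": [
            {"position": (116.32, 39.98), "indication": "green"},
        ],
    },
}
```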
Step 402, receiving environment information outside the vehicle sent by the edge cloud server, wherein the environment information outside the vehicle is acquired from the environment information of the area covered by the edge cloud server based on the running information of the vehicle.
Step 402 is similar to step 302 in implementation and principle, and is not described again.
FIG. 5 is a schematic diagram illustrating a process for determining environment information of an autonomous vehicle according to an exemplary embodiment of the present application.
As shown in fig. 5, after the vehicle is started, start information is sent to an edge cloud server covering an area where the vehicle is currently located. The edge cloud server covering the area where the vehicle is located receives vehicle starting information sent by the vehicle, and then the edge cloud server can establish a communication channel between the edge cloud server and the vehicle for mutual communication between the edge cloud server and the vehicle.
When the vehicle starts, a sensor arranged on the vehicle can acquire environmental information outside the vehicle, in one embodiment, the vehicle can process the sensor data to obtain a sensing result and report the sensing result to the edge cloud server, and in another embodiment, the vehicle can directly report the sensor data to the edge cloud server. Specifically, the perception information outside the vehicle may include various information such as other vehicle information, pedestrian information, status information of traffic lights, and the like, where the other vehicle information may include a position, a driving direction, and speed information of the vehicle; the pedestrian information may include position, direction, speed information of the pedestrian. Further, the vehicle may also report the vehicle running information, such as the position, running direction, speed, and the like of the vehicle, to the edge cloud server. The vehicle external perception information and the vehicle running information constitute environment perception information of the vehicle. The vehicle sends the context awareness information to the edge cloud server.
After receiving the environment perception information sent by the vehicle, the edge cloud server performs fusion calculation on the received environment perception information, and if the environment perception information received by the edge cloud server is enough, the environment information of the area covered by the edge cloud server can be constructed.
According to the running information of the vehicle and the environment information of the coverage range of the edge cloud server, the edge cloud server feeds back the environment information outside the vehicle to the vehicle through the communication channel. The environment information outside the vehicle refers to environment information within a certain range of the vehicle, including information of other vehicles, pedestrian information, state information of traffic lights, and the like, in addition to the running information of the vehicle itself, and may further include information such as zebra crossing information and lane line information.
Specifically, the operations of the fusion calculation, the calculation of the fed-back environment perception information outside the vehicle, and the like are completed by environment perception fusion software in the edge cloud server.
All vehicles within the coverage of the edge cloud server can receive the environment information outside the vehicle sent by the edge cloud server. Each vehicle uses the received external environment information to plan and control its automatic driving.
Fig. 6 is a block diagram illustrating an apparatus for determining environment information of an autonomous vehicle according to a first exemplary embodiment of the present application.
The device for determining the environment information of the automatic driving vehicle provided by the embodiment can be applied to an edge cloud server.
As shown in fig. 6, the present application provides an apparatus 600 for determining environment information of an autonomous vehicle, comprising:
a receiving unit 610, configured to receive environment awareness information sent by a vehicle;
the fusion unit 620 is configured to perform fusion processing on the received environment perception information to obtain environment information of an area covered by the edge cloud server;
the sending unit 630 is configured to feed back environment information outside the vehicle to the vehicle according to the vehicle driving information of the vehicle and the environment information of the area covered by the edge cloud server.
Fig. 7 is a block diagram illustrating an apparatus for determining environment information of an autonomous vehicle according to a second exemplary embodiment of the present application.
The device for determining environment information of the automatic driving vehicle provided by the embodiment can be applied to an edge cloud server.
As shown in fig. 7, in the apparatus 700 for determining environment information of an autonomous vehicle provided by the present application on the basis of the above-mentioned embodiment, the fusion unit 620 includes: the self-driving information correction module 621 is configured to acquire self-driving information included in the environment sensing information, correct the self-driving information according to delay information of the received environment sensing information, and obtain corrected self-driving information;
an external sensing information correcting module 622, configured to obtain external sensing information included in each piece of environment sensing information, and determine, according to each piece of external sensing information, correction sensing information of each object in the environment;
and the environment information determining module 623 is configured to determine environment information of an area covered by the edge cloud server according to the corrected driving information of the vehicle and the corrected sensing information of each object.
The external perception information correcting module 622 is specifically configured to, if the plurality of pieces of external perception information include perception information of the same object, determine correction perception information of the object according to the plurality of pieces of perception information of the object.
In an implementation manner, the external perception information correction module 622 may be further configured to compare a plurality of perception information of the object to obtain a comparison result; if the comparison result represents that the difference between the perception information is smaller than the preset difference value, taking the average value of the perception information as the correction perception information of the object; and if the comparison result represents that abnormal perception information exists, and the difference between the abnormal perception information and at least two other perception information is larger than a preset difference value, taking the average value of other perception information except the abnormal perception information as the correction perception information of the object.
The sending unit 630 is specifically configured to obtain environment information within a preset distance from a location of the vehicle from the environment information of the area covered by the edge cloud server, and send the obtained environment information to the vehicle.
Fig. 8 is a block diagram of an apparatus for acquiring environment information of an autonomous vehicle according to a third exemplary embodiment of the present application.
The apparatus for acquiring environment information of an autonomous vehicle provided by the present embodiment may be applied to a vehicle.
As shown in fig. 8, the apparatus 800 for acquiring environment information of an autonomous vehicle according to the present application includes:
a sending unit 810, configured to obtain and send environment awareness information to an edge cloud server; the environment perception information is used for determining environment information of an area covered by the edge cloud server;
a receiving unit 820, configured to receive environment information outside the vehicle sent by the edge cloud server, where the environment information outside the vehicle is acquired from environment information of an area covered by the edge cloud server based on the vehicle traveling information of the vehicle.
In an alternative embodiment, the environment sensing information in the transmitting unit 810 includes the driving information of the vehicle and the external sensing information.
Fig. 9 is a block diagram of an electronic device according to an exemplary embodiment of the present application.
As shown in fig. 9, the electronic device provided in the present embodiment, which may be the edge cloud server or the vehicle, includes:
a memory 901;
a processor 902; and
a computer program;
wherein a computer program is stored in the memory 901 and configured to be executed by the processor 902 to implement any of the above methods of determining autonomous vehicle environment information.
The present embodiments also provide a computer-readable storage medium having a computer program stored thereon, the computer program being executed by a processor to implement any of the above methods of determining environment information for an autonomous vehicle.
The present embodiments also provide a computer program product comprising a computer program that, when executed by a processor, implements any of the above-described methods for determining environment information for an autonomous vehicle.
A system for determining environment information of an automatic driving vehicle comprises an edge cloud server and the vehicle; the edge cloud server is used for executing any one method of the methods shown in the figures 1 and 2, and the vehicle is used for executing any one method of the methods shown in the figures 3 and 4.
Those of ordinary skill in the art will understand that all or a portion of the steps of the above method embodiments may be implemented by hardware executing program instructions. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the method embodiments described above. The aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks, or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (11)

1. A method of determining environmental information for an autonomous vehicle, applied to an edge cloud server, the method comprising:
receiving environment perception information sent by a vehicle;
acquiring the running information of the vehicle included in the environment perception information, and correcting the running information of the vehicle according to the delay information of the received environment perception information to obtain the corrected running information of the vehicle;
acquiring external perception information included in each environment perception information, and determining correction perception information of each object in the environment according to each external perception information;
determining environment information of an area covered by the edge cloud server according to the corrected running information of the vehicle and the corrected perception information of the objects;
and feeding back environment information outside the vehicle to the vehicle according to the running information of the vehicle and the environment information of the area covered by the edge cloud server.
2. The method of claim 1, wherein determining the corrective perceptual information for each object in the environment based on each external perceptual information comprises:
and if the plurality of external perception information comprise perception information of the same object, determining the correction perception information of the object according to the plurality of perception information of the object.
3. The method of claim 2, wherein determining the correction perception information of the object according to the plurality of perception information of the object comprises:
comparing the plurality of perception information of the object to obtain a comparison result;
if the comparison result indicates that the differences between the perception information are smaller than a preset difference value, taking the average value of the perception information as the correction perception information of the object;
and if the comparison result indicates that abnormal perception information exists, the differences between the abnormal perception information and at least two other pieces of perception information being larger than the preset difference value, taking the average value of the perception information other than the abnormal perception information as the correction perception information of the object.
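The fusion rule of claim 3 can be sketched as follows: a measurement that differs from at least two other measurements by more than the preset difference is treated as abnormal and excluded before averaging. This is a hypothetical illustration with scalar measurements; the function and parameter names are not from the claims:

```python
def fuse_observations(values: list[float], threshold: float) -> float:
    """Fuse several measurements of the same object (claim-3 rule, sketched).

    A value is treated as abnormal when it differs from at least two
    other values by more than `threshold`; abnormal values are dropped
    and the remaining ones are averaged. When all values agree within
    `threshold`, this reduces to a plain average of all of them.
    """
    kept = [
        v for i, v in enumerate(values)
        if sum(1 for j, u in enumerate(values)
               if j != i and abs(v - u) > threshold) < 2
    ]
    # Note: the sketch assumes at least one value survives the filter.
    return sum(kept) / len(kept)
```

With three reported distances [10.0, 10.2, 25.0] and a threshold of 1.0, the outlier 25.0 differs from both other values by more than the threshold, so the fused result is the average of the remaining two, 10.1.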
4. The method of claim 1, wherein feeding back environment information outside the vehicle to the vehicle according to the own-vehicle driving information of the vehicle and the environment information of the area covered by the edge cloud server comprises:
acquiring environment information within a preset distance of the position of the vehicle from the environment information of the area covered by the edge cloud server, and sending the acquired environment information to the vehicle.
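The filtering step of claim 4 is in essence a radius query around the vehicle's position. A sketch assuming planar coordinates and hypothetical object records (the claim does not fix a coordinate system or data format):

```python
import math

def environment_near_vehicle(objects: list[dict],
                             vehicle_pos: tuple[float, float],
                             radius: float) -> list[dict]:
    """Return only the perceived objects within `radius` metres of the
    vehicle's position, per the claim-4 feedback rule (sketched)."""
    vx, vy = vehicle_pos
    return [o for o in objects
            if math.hypot(o["x"] - vx, o["y"] - vy) <= radius]
```

The edge cloud server would run this against its full coverage-area environment model and send the vehicle only the nearby subset, keeping the downlink payload small.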
5. A method of acquiring environment information of an autonomous vehicle, applied to a vehicle, the method comprising:
acquiring environment perception information and sending it to an edge cloud server, so that the edge cloud server acquires the own-vehicle driving information included in the environment perception information, corrects the own-vehicle driving information according to delay information of the received environment perception information to obtain corrected own-vehicle driving information, acquires the external perception information included in each environment perception information, determines correction perception information of each object in the environment according to each external perception information, and determines environment information of an area covered by the edge cloud server according to each corrected own-vehicle driving information and the correction perception information of each object;
receiving environment information outside the vehicle sent by the edge cloud server, wherein the environment information outside the vehicle is acquired from the environment information of the area covered by the edge cloud server based on the own-vehicle driving information of the vehicle.
6. An apparatus for determining environment information of an autonomous vehicle, applied to an edge cloud server, the apparatus comprising:
a receiving unit, configured to receive environment perception information sent by vehicles;
a fusion unit, configured to perform fusion processing on the received environment perception information to obtain environment information of an area covered by the edge cloud server;
a sending unit, configured to feed back environment information outside a vehicle to the vehicle according to the own-vehicle driving information of the vehicle and the environment information of the area covered by the edge cloud server;
wherein the fusion unit comprises:
an own-vehicle driving information correction module, configured to acquire the own-vehicle driving information included in the environment perception information and correct it according to delay information of the received environment perception information to obtain corrected own-vehicle driving information;
an external perception information correction module, configured to acquire the external perception information included in each environment perception information and determine correction perception information of each object in the environment according to each external perception information;
and an environment information determining module, configured to determine the environment information of the area covered by the edge cloud server according to each corrected own-vehicle driving information and the correction perception information of each object.
7. An apparatus for acquiring environment information of an autonomous vehicle, applied to a vehicle, the apparatus comprising:
a sending unit, configured to acquire environment perception information and send it to an edge cloud server, so that the edge cloud server acquires the own-vehicle driving information included in the environment perception information, corrects the own-vehicle driving information according to delay information of the received environment perception information to obtain corrected own-vehicle driving information, acquires the external perception information included in each environment perception information, determines correction perception information of each object in the environment according to each external perception information, and determines environment information of an area covered by the edge cloud server according to each corrected own-vehicle driving information and the correction perception information of each object;
a receiving unit, configured to receive environment information outside the vehicle sent by the edge cloud server, the environment information outside the vehicle being acquired from the environment information of the area covered by the edge cloud server based on the own-vehicle driving information of the vehicle.
8. An electronic device, comprising a memory and a processor; wherein:
the memory is configured to store a computer program;
the processor is configured to read the computer program stored in the memory and execute, according to the computer program, the method of any one of claims 1-4 or claim 5.
9. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of any one of claims 1-4 or claim 5.
10. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1-4 or claim 5.
11. A system for acquiring environment information of an autonomous vehicle, comprising an edge cloud server and a vehicle; wherein the edge cloud server is configured to execute the method of any one of claims 1-4, and the vehicle is configured to execute the method of claim 5.
CN202110833215.1A 2021-07-22 2021-07-22 Method, apparatus, and program product for determining environment information of autonomous vehicle Active CN113581202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110833215.1A CN113581202B (en) 2021-07-22 2021-07-22 Method, apparatus, and program product for determining environment information of autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110833215.1A CN113581202B (en) 2021-07-22 2021-07-22 Method, apparatus, and program product for determining environment information of autonomous vehicle

Publications (2)

Publication Number Publication Date
CN113581202A CN113581202A (en) 2021-11-02
CN113581202B true CN113581202B (en) 2022-07-08

Family

ID=78249303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110833215.1A Active CN113581202B (en) 2021-07-22 2021-07-22 Method, apparatus, and program product for determining environment information of autonomous vehicle

Country Status (1)

Country Link
CN (1) CN113581202B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115662168A (en) * 2022-10-18 2023-01-31 浙江吉利控股集团有限公司 Environment sensing method and device and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11269352B2 (en) * 2017-12-15 2022-03-08 Baidu Usa Llc System for building a vehicle-to-cloud real-time traffic map for autonomous driving vehicles (ADVS)
DE102018221179A1 (en) * 2018-12-06 2020-06-10 Robert Bosch Gmbh Method and system for determining certain status information for at least one geographical position using autonomous or semi-autonomous vehicles
CN110422177A (en) * 2019-07-08 2019-11-08 浙江吉利汽车研究院有限公司 A kind of control method for vehicle, apparatus and system
KR102241296B1 (en) * 2019-08-26 2021-04-16 엘지전자 주식회사 Method and apparatus for data sharing using mec server in autonomous driving system
CN112530156A (en) * 2019-09-18 2021-03-19 中移智行网络科技有限公司 Intelligent network automobile open road system based on edge calculation and construction method

Also Published As

Publication number Publication date
CN113581202A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN110544376B (en) Automatic driving assistance method and device
CN109739236B (en) Vehicle information processing method and device, computer readable medium and electronic equipment
US10349011B2 (en) System and method for improved obstacle awareness in using a V2X communications system
CN109389832B (en) System and method for improving obstacle awareness using a V2X communication system
JP6424761B2 (en) Driving support system and center
EP3213260B1 (en) Method and device for localizing a vehicle in the environment thereof
DE102019131118A1 (en) SYSTEM AND METHOD FOR EVALUATING THE OPERATION OF ENVIRONMENTAL DETECTION SYSTEMS OF VEHICLES
CN110570674A (en) Vehicle-road cooperative data interaction method and system, electronic equipment and readable storage medium
US9230441B2 (en) Apparatus for gathering surroundings information of vehicle
DE102016205972A1 (en) Method for the autonomous or semi-autonomous execution of a cooperative driving maneuver
US20200168094A1 (en) Control device, control method, and program
CN104065920A (en) Vehicle monitoring and tracking method, system and server
EP2766891A1 (en) Method for operating a driver assistance system and method for processing vehicle environment data
CN111301427A (en) Method and driver assistance system for determining a lane and vehicle
CN113581202B (en) Method, apparatus, and program product for determining environment information of autonomous vehicle
CN112735162A (en) Vehicle scheduling method, device, system, equipment and storage medium
CN114506323B (en) Formation vehicle control method, device, equipment and medium
DE102018202966A1 (en) Method for operating at least one automated vehicle
US20210150889A1 (en) Determination and use of cluster-based stopping points for motor vehicles
CN110940346B (en) High-precision map processing method and device for automatic driving lane changing
CN116010854B (en) Abnormality cause determination method, abnormality cause determination device, electronic device and storage medium
US20210312800A1 (en) A method for controlling vehicles
US20230032741A1 (en) Road model generation method and device
CN109326118B (en) Motorcade position prediction method and device
CN111640321B (en) Congestion relieving method based on edge calculation and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant