WO2023025061A1 - Data processing method and apparatus - Google Patents

Data processing method and apparatus

Info

Publication number
WO2023025061A1
Authority
WO
WIPO (PCT)
Prior art keywords
coverage
information
roadside
area
communication
Prior art date
Application number
PCT/CN2022/113648
Other languages
English (en)
French (fr)
Inventor
费雯凯
刘建琴
伍勇
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP22860407.0A (EP4379570A1)
Publication of WO2023025061A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases

Definitions

  • This application relates to the field of electronic maps, in particular to a data processing method and device.
  • Sensors play a very important role in the assisted driving and automatic driving of smart cars.
  • Various sensors installed on the car, such as millimeter-wave radar, lidar, ultrasonic radar or cameras, can perceive the surrounding environment while the car is driving, identify and track moving objects, and identify static scenes (such as lane lines and signs).
  • Sensors can detect possible dangers in advance and remind the driver in time, or assist the driver or automatically take measures to avoid danger, effectively increasing the safety and comfort of car driving.
  • A high-definition map, also known as a high-precision map, is one of the key capabilities for realizing automatic driving; it will become an effective supplement to the existing sensors for automatic driving and improve the security of automatic driving decisions.
  • Compared with traditional navigation maps, high-precision maps for autonomous driving have higher requirements in all aspects, and can cooperate with sensors and algorithms to provide support for decision-making.
  • The outside world changes dynamically and affects the driving of the vehicle. Therefore, in addition to static layers, high-precision maps increasingly need more dynamic information to meet the development needs of the transportation field.
  • However, the richness of existing map content cannot fully meet the needs of future use.
  • In view of this, the embodiments of the present application provide a data processing method and device.
  • A new type of map information is added to the map, namely the coverage information of roadside equipment, which improves the richness of map information and can meet higher-level map use requirements.
  • According to a first aspect, an embodiment of the present application provides a data processing method, the method comprising:
  • acquiring coverage information of roadside equipment, where the coverage information includes coverage area information for indicating at least one coverage area of the roadside equipment and coverage capability information for indicating the coverage capability of the roadside equipment in the at least one coverage area; and
  • storing the coverage information as map data.
  • In this method, the coverage information of the roadside equipment is maintained in the map, which satisfies the needs of users. Later, when other devices use the information provided by the roadside device, the coverage area of the roadside device and the coverage capability within the coverage area can be obtained from the map, providing a reference for how to use the information provided by the roadside device. For example, through the coverage information, it is possible to more accurately determine the confidence level of the perception results of the roadside equipment in a certain area, or to determine the robustness of the communication connection with the roadside equipment in a certain area, which improves the reliability of automatic driving or assisted driving.
  • acquiring the coverage information of the roadside equipment may be generating the coverage information of the roadside equipment.
  • If cloud devices, roadside devices, and terminal devices all have information generating capabilities, the execution subject of the method can be a cloud device, a roadside device or a terminal device, including but not limited to devices such as a cloud map server, an application server, a road side unit (RSU), an edge processor (multi-access edge computing, MEC), a vehicle or a portable terminal, or components, chips, software modules or hardware modules within these devices.
  • Obtaining the coverage information of the roadside equipment may also be receiving the coverage information of the roadside equipment.
  • the receiving is a receiving operation based on wireless communication or wired communication between devices.
  • In this case, the execution subject of the method may be a cloud device, a roadside device or a terminal device, including but not limited to devices such as a map server in the cloud, an application server, a road side unit (RSU), an edge processor (multi-access edge computing, MEC), a vehicle or a portable terminal; application scenarios include but are not limited to information transmission between vehicles, between roadside devices, between vehicles and the cloud, or between vehicles and the roadside. In another case, the receiving operation is an invocation by a module based on a bus, wiring, interface or parameters within the device.
  • In that case, the execution subject of the method can be a component, chip, software module or hardware module in the above-mentioned device.
  • Storing the coverage information as map data refers to storing the coverage information in the map database, as information carried in the map, in the compiled form or storage format used by other information in the map.
  • the execution body of the method can be located in the cloud, the roadside or the terminal, and the map data can be correspondingly stored in the storage medium of the cloud, the roadside or the terminal.
  • In a possible implementation, the method further includes: using the coverage information to generate a map or update a map.
  • the map may be a high-precision map.
  • the coverage information further includes tile identifiers.
  • the coverage information can be associated with the tiles through the identification of the tiles, which can facilitate the maintenance of the coverage information by using the map data management method.
  • A tile can be understood as a rectangular grid image obtained by cutting a map within a certain range into several rows and columns according to a certain size and format at a given map resolution.
  • the sliced rectangular grid image is called Tile.
  • a tile of a certain level is composed of 4 tiles of the corresponding higher level.
  • tile 1 is a tile of a certain level in the map.
  • Subdividing tile 1 can further generate 4 tiles of one level higher than tile 1, with the identifiers 1-00, 1-01, 1-10 and 1-11 respectively.
  • The geographic coverage of tile 1 is the union of the geographic coverage of tile 1-00, the geographic coverage of tile 1-01, the geographic coverage of tile 1-10 and the geographic coverage of tile 1-11.
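  • As a minimal illustration of the tile subdivision described above (a sketch only: the quadtree-style identifiers, the bounding-box representation and the function names are assumptions chosen for readability, not a format defined by this application):

```python
# Sketch of quadtree-style tile subdivision; identifiers follow the "1" -> "1-00" ... "1-11" example above.
from typing import List, Tuple

# A tile's geographic coverage is modelled here as an axis-aligned bounding box:
# (min_lon, min_lat, max_lon, max_lat).
BBox = Tuple[float, float, float, float]

def subdivide(tile_id: str, bbox: BBox) -> List[Tuple[str, BBox]]:
    """Cut one tile into the 4 tiles of the next higher level."""
    min_lon, min_lat, max_lon, max_lat = bbox
    mid_lon = (min_lon + max_lon) / 2
    mid_lat = (min_lat + max_lat) / 2
    return [
        (f"{tile_id}-00", (min_lon, min_lat, mid_lon, mid_lat)),
        (f"{tile_id}-01", (mid_lon, min_lat, max_lon, mid_lat)),
        (f"{tile_id}-10", (min_lon, mid_lat, mid_lon, max_lat)),
        (f"{tile_id}-11", (mid_lon, mid_lat, max_lon, max_lat)),
    ]

# The union of the 4 child bounding boxes equals the geographic coverage of the parent tile.
for child_id, child_bbox in subdivide("1", (116.0, 39.0, 117.0, 40.0)):
    print(child_id, child_bbox)
```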
  • the coverage information further includes an identifier of the roadside device.
  • the at least one coverage area includes M communication coverage areas and N perception coverage areas, where M and N are natural numbers, and M and N are not both 0.
  • the coverage area may include one or more communication coverage areas, may also include one or more perception coverage areas, and may include both communication coverage areas and perception coverage areas.
  • the communication coverage area is used to reflect the communication capability of the roadside equipment
  • the perception coverage area is used to reflect the perception capability of the roadside equipment
  • the at least one coverage area is divided into levels according to the coverage capability of the at least one coverage area.
  • the coverage information includes coverage area information for indicating the M communication coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the M communication coverage areas, and the M communication coverage areas are classified into levels according to the coverage capabilities of the M communication coverage areas, where M is greater than 1.
  • the coverage information includes coverage area information for indicating the N perception coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the N perception coverage areas;
  • the N perception coverage areas are divided into levels according to the coverage capabilities of the N perception coverage areas, where N is greater than 1.
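  • A possible encoding of such coverage information is sketched below; the field names, level values and use of Python dataclasses are illustrative assumptions and are not mandated by this application:

```python
# Illustrative record for the coverage information of one roadside device.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CoverageArea:
    area_type: str                        # "communication" or "perception"
    level: int                            # level derived from the coverage capability
    boundary: List[Tuple[float, float]]   # vertices describing the coverage area
    capability: dict                      # e.g. {"accuracy": 0.90} or {"success_rate": 0.98}

@dataclass
class CoverageInfo:
    roadside_device_id: str               # identifier of the roadside device
    tile_ids: List[str]                   # identifiers of the map tiles the areas fall into
    communication_areas: List[CoverageArea] = field(default_factory=list)  # M areas
    perception_areas: List[CoverageArea] = field(default_factory=list)     # N areas

# Example with M = 1 communication coverage area and N = 2 perception coverage areas,
# the perception areas being divided into levels by their coverage capability.
info = CoverageInfo(
    roadside_device_id="RSU-042",
    tile_ids=["1-01"],
    communication_areas=[
        CoverageArea("communication", 1, [(0, 0), (50, 0), (50, 30), (0, 30)], {"success_rate": 0.98}),
    ],
    perception_areas=[
        CoverageArea("perception", 1, [(0, 0), (30, 0), (30, 20), (0, 20)], {"accuracy": 0.90}),
        CoverageArea("perception", 2, [(0, 0), (60, 0), (60, 40), (0, 40)], {"accuracy": 0.75}),
    ],
)
print(len(info.communication_areas), len(info.perception_areas))
```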
  • At least one of the N sensing coverage areas corresponds to a sensing device group, which is embodied in that the N sensing coverage areas include a multi-device sensing coverage area, and the multi-device sensing coverage area and the coverage capability of the roadside device in the multi-device sensing coverage area are determined according to the coverage capabilities of multiple sensing devices related to the roadside device.
  • The multiple sensing devices related to the roadside device may be multiple sensing devices included in the roadside device, or multiple sensing devices that are associated with the roadside device and send sensing information to the roadside device.
  • the sensing coverage area of the roadside device may correspond to an independent sensing device, or may correspond to a sensing device group. Wherein, the sensing device group includes one or more sensing devices related to the roadside device.
  • the sensing device group may include a lidar and a camera, and the information sensed by the lidar and the camera may be fused to obtain the sensing coverage area of the fused sensing capability and the coverage capability corresponding to the sensing coverage area.
  • the N sensing coverage areas respectively correspond to N sensing device groups; the coverage information further includes: identifiers of the sensing device groups.
  • the coverage information may include the identification of the sensing device group, so that the sensing coverage information structure is clearer, and it is easy to use and manage.
  • the N sensing coverage areas respectively correspond to N sensing devices; the coverage information further includes: identifiers of the sensing devices.
  • In a case where a part of the sensing coverage areas corresponds to sensing device groups and another part corresponds to sensing devices, the coverage information further includes: the identifier of the sensing device group and the identifier of the sensing device.
  • the coverage capability of a sensing device group may be obtained by fusing the sensing capabilities of multiple sensing devices.
  • the fused coverage capabilities are divided into areas according to levels, so as to obtain the coverage area corresponding to the sensing device group.
  • the coverage area of the sensing device group may be obtained according to the union of multiple sub-coverage areas, and each sub-coverage area may correspond to a sensing device in the sensing device group.
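  • One way the union of sub-coverage areas could be computed is sketched below, using a simple grid-cell representation of areas (the cell model and the device names are assumptions made to keep the example self-contained; an implementation could equally use polygon geometry):

```python
# Sketch: coverage area of a sensing device group as the union of its members' sub-coverage areas.
from typing import Dict, Set, Tuple

Cell = Tuple[int, int]  # a grid cell identified by (row, col)

def group_coverage(sub_areas: Dict[str, Set[Cell]]) -> Set[Cell]:
    """Union of the sub-coverage areas, one per sensing device in the group."""
    union: Set[Cell] = set()
    for cells in sub_areas.values():
        union |= cells
    return union

# Example: a lidar and a camera belonging to the same sensing device group.
sub_areas = {
    "lidar-1":  {(0, 0), (0, 1), (1, 0), (1, 1)},
    "camera-1": {(1, 1), (1, 2), (2, 1), (2, 2)},
}
print(sorted(group_coverage(sub_areas)))
```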
  • For example, the roadside device is related to a first sensing device and a second sensing device, and the N sensing coverage areas include a first coverage area of the first sensing device and a second coverage area of the second sensing device; the coverage capability information includes first coverage capability information for indicating the coverage capability of the first sensing device in the first coverage area and second coverage capability information for indicating the coverage capability of the second sensing device in the second coverage area.
  • the coverage information further includes information about a blind area
  • the blind area includes a communication blind area, a perception blind area, or a communication blind area and a perception blind area.
  • the coverage capability information is used to indicate at least one of the following:
  • In the embodiment of the present application, several types of content (or indicators) indicated by the coverage capability information are given as examples.
  • By using the coverage capability information to indicate one or more of the above-mentioned contents, the design rationality of the coverage information can be improved, which is convenient for subsequent use.
  • the vehicle can communicate with roadside equipment at any time during driving, and the communication status can be indicated according to the communication stability of roadside equipment, which is convenient for timely planning and adjustment of communication requirements with roadside equipment.
  • the coverage capability information is used to indicate at least one of the following:
  • the strategy of automatic driving is inseparable from the perception results.
  • the confidence of the perception results can be determined according to the correct rate and recall rate of the perception results indicated by the perception ability, which can improve the reliability of the automatic driving strategy.
  • the coverage capability information indicates the coverage capabilities in various environments.
  • the coverage areas corresponding to the coverage capabilities may be different under different environments such as sunny days, rainy days, and foggy weather. For another example, at different times such as day and night, and under different temperature, humidity, and brightness conditions, the areas corresponding to the coverage capabilities may be different. Through the coverage areas corresponding to various capabilities in various environments, when the coverage information is used later, the scene factors can be reasonably considered to improve the accuracy of the coverage information.
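  • For illustration, such environment-dependent capability could be recorded per scene condition, as in the following sketch (the weather/time keys and the numeric values are assumptions, not values defined by this application):

```python
# Illustrative per-environment coverage capability lookup.
coverage_by_environment = {
    ("sunny", "day"):   {"accuracy": 0.95, "radius_m": 120},
    ("rainy", "day"):   {"accuracy": 0.85, "radius_m": 90},
    ("foggy", "night"): {"accuracy": 0.70, "radius_m": 60},
}

def capability_for(weather: str, time_of_day: str) -> dict:
    """Return the recorded capability for the current scene, falling back to the weakest entry."""
    weakest = min(coverage_by_environment.values(), key=lambda c: c["accuracy"])
    return coverage_by_environment.get((weather, time_of_day), weakest)

print(capability_for("rainy", "day"))
```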
  • the coverage area is a section of a road or a section of a lane, which makes it more convenient to assist driving based on the coverage information.
  • the coverage information is displayed on a display interface.
  • the display interface includes, but is not limited to, a display screen on a vehicle, a display screen on a portable terminal, or a projected display screen.
  • the display method may be a graphic interface display, for example, the coverage area is superimposed on a map display interface, or the coverage capability corresponding to the coverage area is further displayed; the display method may also be text display.
  • the coverage information is sent.
  • the map-generating-side device may carry the coverage information in a map data packet and send it to the map-using-side device.
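  • As a non-authoritative sketch, the coverage information might be carried in a map data packet as follows (the JSON encoding and field names are assumptions; the application does not prescribe a wire format):

```python
# Sketch: carrying coverage information inside a map data packet.
import json

def build_map_data_packet(tile_id: str, coverage_info: dict) -> bytes:
    """Wrap the coverage information of one tile into a serialized map data packet."""
    packet = {
        "tile_id": tile_id,
        "layers": {
            "roadside_coverage": coverage_info,  # the new map information described in this application
        },
    }
    return json.dumps(packet).encode("utf-8")

packet_bytes = build_map_data_packet("1-01", {"roadside_device_id": "RSU-042", "areas": []})
# The map-using-side device decodes the packet and reads the coverage layer:
print(json.loads(packet_bytes)["layers"]["roadside_coverage"])
```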
  • the coverage information is used to perform information processing or generate a control signal for controlling the vehicle. For example:
  • When the vehicle is located in a certain coverage area, according to the coverage area indicated by the coverage information and the coverage capability in the coverage area, the safety level of the vehicle may be determined; or the confidence level of the sensing result from the roadside device may be determined; or a first reminder message may be triggered to remind the user to turn on the automatic driving function of the vehicle or the assisted driving function of the vehicle; or a second reminder message may be triggered to remind the user to take over the vehicle.
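  • A hedged sketch of how a vehicle-side consumer might act on the coverage capability at its current position, along the lines of the example above (the thresholds and the reminder wording are illustrative assumptions):

```python
# Sketch: mapping the roadside coverage capability at the vehicle's position to a vehicle-side action.
from typing import Optional

def decide(accuracy_at_position: Optional[float]) -> str:
    """Choose an action from the perception accuracy that the coverage information reports here."""
    if accuracy_at_position is None:   # vehicle is in a blind area or outside any coverage area
        return "trigger second reminder: ask the user to take over the vehicle"
    if accuracy_at_position >= 0.90:   # high-capability coverage area
        return "trigger first reminder: automatic or assisted driving may be enabled"
    if accuracy_at_position >= 0.75:
        return "use roadside perception results with reduced confidence"
    return "rely mainly on on-board sensors"

print(decide(0.92))
print(decide(None))
```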
  • According to a second aspect, an embodiment of the present application provides a data processing device, including:
  • An acquiring unit configured to acquire coverage information of roadside equipment, where the coverage information includes coverage area information for indicating at least one coverage area of the roadside equipment and information for indicating that the roadside equipment is in the at least one coverage area Coverage capability information of the coverage capability in the area;
  • a storage unit configured to store the coverage information as map data.
  • With this device, the coverage information of the roadside equipment is maintained in the map, which satisfies the needs of users. Later, when other devices use the information provided by the roadside device, the coverage area of the roadside device and the coverage capability within the coverage area can be obtained from the map, providing a reference for how to use the information provided by the roadside device. For example, through the coverage information, it is possible to more accurately determine the confidence level of the perception results of the roadside equipment in a certain area, or to determine the robustness of the communication connection with the roadside equipment in a certain area, which improves the reliability of automatic driving or assisted driving.
  • the obtaining unit may be a processing unit for generating the coverage information of the roadside equipment.
  • The data processing device in this case can be a cloud device, a roadside device or a terminal device, including but not limited to devices such as a cloud map server, an application server, a road side unit (RSU), an edge processor (multi-access edge computing, MEC), a vehicle or a portable terminal, or components, chips, software modules or hardware modules in these devices.
  • the acquisition unit may also be a communication unit for receiving coverage information of roadside equipment.
  • the receiving is a receiving operation based on wireless communication or wired communication between devices.
  • In this case, the data processing device can be a cloud device, a roadside device or a terminal device, including but not limited to devices such as a cloud map server, an application server, a road side unit (RSU), an edge processor (multi-access edge computing, MEC), a vehicle or a portable terminal; application scenarios include but are not limited to information transmission between vehicles, between roadside devices, between vehicles and the cloud, or between vehicles and the roadside.
  • In another case, the receiving operation is an invocation by a module based on a bus, wiring, interface or parameters within the device; in that case, the data processing device can be a component, chip, software module or hardware module in the above-mentioned device.
  • Storing the coverage information as map data refers to storing the coverage information in the map database, as information carried in the map, in the compiled form or storage format used by other information in the map.
  • the execution body of the method can be located in the cloud, the roadside or the terminal, and the map data can be correspondingly stored in the storage medium of the cloud, the roadside or the terminal.
  • the processing unit included in the device uses the coverage information to generate a map or update a map.
  • a layer in the map may be generated or updated according to the coverage information.
  • the map may be a high-precision map.
  • the coverage information further includes tile identifiers.
  • the coverage information can be associated with the tiles through the identification of the tiles, which can facilitate the maintenance of the coverage information by using the map data management method.
  • the coverage information further includes an identifier of the roadside device.
  • the at least one coverage area includes M communication coverage areas and N perception coverage areas, where M and N are natural numbers, and M and N are not both 0.
  • the coverage area may include one or more communication coverage areas, may also include one or more perception coverage areas, and may include both communication coverage areas and perception coverage areas.
  • the communication coverage area is used to reflect the communication capability of the roadside equipment
  • the perception coverage area is used to reflect the perception capability of the roadside equipment
  • the at least one coverage area is divided into levels according to the coverage capability of the at least one coverage area.
  • the coverage information includes coverage area information for indicating the M communication coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the M communication coverage areas, and the M communication coverage areas are classified into levels according to the coverage capabilities of the M communication coverage areas, where M is greater than 1.
  • the coverage information includes coverage area information for indicating the N perception coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the N perception coverage areas;
  • the N perception coverage areas are divided into levels according to the coverage capabilities of the N perception coverage areas, where N is greater than 1.
  • At least one of the N sensing coverage areas corresponds to a sensing device group, which is embodied in that the N sensing coverage areas include a multi-device sensing coverage area, and the multi-device sensing coverage area and the coverage capability of the roadside device in the multi-device sensing coverage area are determined according to the coverage capabilities of multiple sensing devices related to the roadside device.
  • The multiple sensing devices related to the roadside device may be multiple sensing devices included in the roadside device, or multiple sensing devices that are associated with the roadside device and send sensing information to the roadside device.
  • the sensing coverage area of the roadside device may correspond to an independent sensing device, or may correspond to a sensing device group. Wherein, the sensing device group includes one or more sensing devices related to the roadside device.
  • the sensing device group may include a lidar and a camera, and the information sensed by the lidar and the camera may be fused to obtain the sensing coverage area of the fused sensing capability and the coverage capability corresponding to the sensing coverage area.
  • the N sensing coverage areas respectively correspond to N sensing device groups; the coverage information further includes: identifiers of the sensing device groups.
  • the coverage information may include the identification of the sensing device group, so that the sensing coverage information structure is clearer, and it is easy to use and manage.
  • the N sensing coverage areas respectively correspond to N sensing devices; the coverage information further includes: identifiers of the sensing devices.
  • In a case where a part of the sensing coverage areas corresponds to sensing device groups and another part corresponds to sensing devices, the coverage information further includes: the identifier of the sensing device group and the identifier of the sensing device.
  • the coverage capability of a sensing device group may be obtained by fusing the sensing capabilities of multiple sensing devices.
  • the fused coverage capabilities are divided into areas according to levels, so as to obtain the coverage area corresponding to the sensing device group.
  • the coverage area of the sensing device group may be obtained according to the union of multiple sub-coverage areas, and each sub-coverage area may correspond to a sensing device in the sensing device group.
  • For example, the roadside device is related to a first sensing device and a second sensing device, and the N sensing coverage areas include a first coverage area of the first sensing device and a second coverage area of the second sensing device; the coverage capability information includes first coverage capability information for indicating the coverage capability of the first sensing device in the first coverage area and second coverage capability information for indicating the coverage capability of the second sensing device in the second coverage area.
  • the coverage information further includes information about a blind area
  • the blind area includes a communication blind area, a perception blind area, or a communication blind area and a perception blind area.
  • the coverage capability information is used to indicate at least one of the following:
  • In the embodiment of the present application, several types of content (or indicators) indicated by the coverage capability information are given as examples.
  • By using the coverage capability information to indicate one or more of the above-mentioned contents, the design rationality of the coverage information can be improved, which is convenient for subsequent use.
  • the vehicle can communicate with the roadside equipment at any time during driving, and the communication status can be indicated according to the communication stability of the roadside equipment, which is convenient for timely planning and adjustment of the communication requirements with the roadside equipment.
  • the coverage capability information is used to indicate at least one of the following:
  • the strategy of automatic driving is inseparable from the perception results.
  • the confidence of the perception results can be determined according to the correct rate and recall rate of the perception results indicated by the perception ability, which can improve the reliability of the automatic driving strategy.
  • the coverage capability information indicates the coverage capabilities in various environments.
  • the coverage areas corresponding to the coverage capabilities may be different under different environments such as sunny days, rainy days, and foggy weather. For another example, at different times such as day and night, and under different temperature, humidity, and brightness conditions, the areas corresponding to the coverage capabilities may be different. Through the coverage areas corresponding to various capabilities in various environments, when the coverage information is used later, the scene factors can be reasonably considered to improve the accuracy of the coverage information.
  • the coverage area is a section of a road or a section of a lane, which makes it more convenient to assist driving based on the coverage information.
  • the device includes a display unit for displaying the coverage information on a display interface.
  • the display interface includes, but is not limited to, a display screen on a vehicle, a display screen on a portable terminal, or a projected display screen.
  • the display method may be a graphic interface display, for example, the coverage area is superimposed on a map display interface, or the coverage capability corresponding to the coverage area is further displayed; the display method may also be text display.
  • the apparatus includes a communication unit, configured to send the coverage information.
  • the map-generating-side device may carry the coverage information in a map data packet and send it to the map-using-side device.
  • the device includes a processing unit, which uses the coverage information to perform information processing or generate a control signal for controlling the vehicle. For example:
  • When the vehicle is located in a certain coverage area, according to the coverage area indicated by the coverage information and the coverage capability within the coverage area, the safety level of the vehicle may be determined; or the confidence level of the sensing result from the roadside device may be determined; or a first reminder message may be triggered to remind the user to turn on the automatic driving function of the vehicle or the assisted driving function of the vehicle; or a second reminder message may be triggered to remind the user to take over the vehicle.
  • According to a third aspect, an embodiment of the present application provides a data processing device, which may include a processor configured to implement the data processing method described in the foregoing first aspect or any possible implementation manner of the foregoing first aspect.
  • The device may further include a memory coupled to the processor; when the processor executes the computer program stored in the memory, the data processing method described in the foregoing first aspect or any possible implementation manner of the foregoing first aspect may be realized.
  • The device may further include a communication interface, where the communication interface is used to receive computer-executed instructions and transmit them to the processor; the processor is used to execute the computer-executed instructions, so that the data processing device executes the data processing method described in the foregoing first aspect or any possible implementation manner of the foregoing first aspect.
  • the computer program in the memory in the embodiment of the present application can be pre-stored or stored after being downloaded from the network when using the device, and the source of the computer program in the memory is not specifically limited in the embodiment of the present application.
  • the coupling in the embodiments of the present application is an indirect coupling or connection between devices, units or modules, which may be in electrical, mechanical or other forms, and is used for information exchange between devices, units or modules.
  • An embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the data processing method described in the foregoing first aspect or any possible implementation manner of the foregoing first aspect is realized.
  • the embodiment of the present application provides a computer program product.
  • When the computer program product is read and executed by a processor, the data processing method described in the above first aspect or any possible implementation of the above first aspect will be executed.
  • An embodiment of the present application provides a vehicle, where the vehicle includes the data processing device described in the above second aspect or any possible implementation of the above second aspect, or the data processing device described in the above third aspect or any possible implementation manner of the above third aspect.
  • An embodiment of the present application provides a map, where the map includes coverage information of roadside equipment, and the coverage information includes coverage area information used to indicate at least one coverage area of the roadside equipment and coverage capability information used to indicate the coverage capability of the roadside device in the at least one coverage area.
  • The map in the embodiment of the present application is a map product; specifically, it can be a map data product carrying map information, such as a map update data package; or a map application product loaded with map information, such as a map application that can be installed on a vehicle or a portable terminal; or a map display product that presents map information, such as a paper map or an electronic navigator.
  • With this map, the coverage information of the roadside equipment is maintained in the map, which satisfies the needs of users. Later, when other devices use the information provided by the roadside device, the coverage area of the roadside device and the coverage capability within the coverage area can be obtained from the map, providing a reference for how to use the information provided by the roadside device. For example, through the coverage information, it is possible to more accurately determine the confidence level of the perception results of the roadside equipment in a certain area, or to determine the robustness of the communication connection with the roadside equipment in a certain area, which improves the reliability of automatic driving or assisted driving.
  • the map may be a high-precision map.
  • the coverage information further includes tile identifiers.
  • the coverage information can be associated with the tiles through the identification of the tiles, which can facilitate the maintenance of the coverage information by using the map data management method.
  • the coverage information further includes an identifier of the roadside device.
  • the at least one coverage area includes M communication coverage areas and N perception coverage areas, where M and N are natural numbers, and M and N are not both 0.
  • the coverage area may include one or more communication coverage areas, may also include one or more perception coverage areas, and may include both communication coverage areas and perception coverage areas.
  • the communication coverage area is used to reflect the communication capability of the roadside equipment
  • the perception coverage area is used to reflect the perception capability of the roadside equipment
  • the at least one coverage area is divided into levels according to the coverage capability of the at least one coverage area.
  • the coverage information includes coverage area information for indicating the M communication coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the M communication coverage areas, and the M communication coverage areas are classified into levels according to the coverage capabilities of the M communication coverage areas, where M is greater than 1.
  • the coverage information includes coverage area information for indicating the N perception coverage areas and coverage capability information for indicating the coverage capabilities of the roadside equipment in the N perception coverage areas;
  • the N perception coverage areas are divided into levels according to the coverage capabilities of the N perception coverage areas, where N is greater than 1.
  • At least one of the N sensing coverage areas corresponds to a sensing device group, which is embodied in that the N sensing coverage areas include a multi-device sensing coverage area, and the multi-device sensing coverage area and the coverage capability of the roadside device in the multi-device sensing coverage area are determined according to the coverage capabilities of multiple sensing devices related to the roadside device.
  • the above illustrates possible designs for the sensing coverage area.
  • The plurality of sensing devices related to the roadside device may be a plurality of sensing devices included in the roadside device, or a plurality of sensing devices that are associated with the roadside device and send sensing information to the roadside device.
  • the sensing coverage area of the roadside device may correspond to an independent sensing device, or may correspond to a sensing device group. Wherein, the sensing device group includes one or more sensing devices related to the roadside device.
  • the sensing device group may include a lidar and a camera, and the information sensed by the lidar and the camera may be fused to obtain the sensing coverage area of the fused sensing capability and the coverage capability corresponding to the sensing coverage area.
  • the N sensing coverage areas respectively correspond to N sensing device groups; the coverage information further includes: identifiers of the sensing device groups.
  • the coverage information may include the identification of the sensing device group, so that the sensing coverage information structure is clearer, and it is easy to use and manage.
  • the N sensing coverage areas respectively correspond to N sensing devices; the coverage information further includes: identifiers of the sensing devices.
  • A part of the sensing coverage areas corresponds to sensing device groups, and the other part corresponds to sensing devices; in this case, the coverage information further includes: the identifier of the sensing device group and the identifier of the sensing device.
  • the coverage capability of a sensing device group may be obtained by fusing the sensing capabilities of multiple sensing devices.
  • the fused coverage capabilities are divided into areas according to levels, so as to obtain the coverage area corresponding to the sensing device group.
  • the coverage area of the sensing device group may be obtained according to the union of multiple sub-coverage areas, and each sub-coverage area may correspond to a sensing device in the sensing device group.
  • For example, the roadside device is related to a first sensing device and a second sensing device, and the N sensing coverage areas include a first coverage area of the first sensing device and a second coverage area of the second sensing device; the coverage capability information includes first coverage capability information for indicating the coverage capability of the first sensing device in the first coverage area and second coverage capability information for indicating the coverage capability of the second sensing device in the second coverage area.
  • the coverage information further includes information about a blind area
  • the blind area includes a communication blind area, a perception blind area, or a communication blind area and a perception blind area.
  • the coverage capability information is used to indicate at least one of the following:
  • In the embodiment of the present application, several types of content (or indicators) indicated by the coverage capability information are given as examples.
  • By using the coverage capability information to indicate one or more of the above-mentioned contents, the design rationality of the coverage information can be improved, which is convenient for subsequent use.
  • the vehicle can communicate with the roadside equipment at any time during driving, and the communication status can be indicated according to the communication stability of the roadside equipment, which is convenient for timely planning and adjustment of the communication requirements with the roadside equipment.
  • the coverage capability information is used to indicate at least one of the following:
  • the strategy of automatic driving is inseparable from the perception results.
  • the confidence of the perception results can be determined according to the correct rate and recall rate of the perception results indicated by the perception ability, which can improve the reliability of the automatic driving strategy.
  • the coverage capability information indicates the coverage capabilities in various environments.
  • the coverage areas corresponding to the coverage capabilities may be different under different environments such as sunny days, rainy days, and foggy weather. For another example, at different times such as day and night, and under different temperature, humidity, and brightness conditions, the areas corresponding to the coverage capabilities may be different. Through the coverage areas corresponding to various capabilities in various environments, when using the coverage information in the future, the scene factors can be reasonably considered to improve the accuracy of the coverage information.
  • the coverage area is a section of a road or a section of a lane, which makes it more convenient to assist driving based on the coverage information.
  • an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores the map in the seventh aspect or any implementation manner of the seventh aspect.
  • FIG. 1 is a schematic diagram of an application scenario applicable to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a perception coverage provided by an embodiment of the present application.
  • Fig. 3 is a schematic diagram of a communication coverage provided by an embodiment of the present application.
  • FIG. 4 is a schematic flow diagram of a data processing method provided in an embodiment of the present application.
  • FIG. 5A is a schematic diagram of a method for indicating a coverage area provided by an embodiment of the present application.
  • FIG. 5B is a schematic diagram of another method for indicating a coverage area provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of another method for indicating a coverage area provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a usage scenario of a data processing method provided in an embodiment of the present application.
  • FIG. 8A is a schematic diagram of another scenario provided by the embodiment of the present application.
  • FIG. 8B is a schematic diagram of a coverage area provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a blind area provided by an embodiment of the present application.
  • FIG. 10A is a schematic diagram of a map layer provided by an embodiment of the present application.
  • FIG. 10B is a schematic diagram of a map provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a data structure of coverage information provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another data structure of coverage information provided by the embodiment of the present application.
  • Fig. 13 is a schematic flowchart of another data processing method provided by the embodiment of the present application.
  • Fig. 14 is a schematic structural diagram of a data processing device provided by an embodiment of the present application.
  • Fig. 15 is a schematic structural diagram of another data processing device provided by an embodiment of the present application.
  • Fig. 16 is a schematic structural diagram of another data processing device provided by the embodiment of the present application.
  • FIG. 17 shows a flowchart of a method for generating perception capability information provided by an embodiment of the present application
  • FIG. 18A shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • FIG. 18B shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • FIG. 18C shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • Fig. 19A shows a schematic diagram of the first group of position points and corresponding trajectories in the embodiment of the present application
  • Fig. 19B shows a schematic diagram of the second group of position points and corresponding trajectories in the embodiment of the present application.
  • Fig. 19C shows a schematic diagram of matching results in the embodiment of the present application.
  • Figure 19D shows a schematic diagram of trajectory matching in the embodiment of the present application.
  • FIG. 20A shows an exemplary schematic diagram of regions to be divided in the embodiment of the present application.
  • FIG. 20B shows an exemplary schematic diagram of a grid in an embodiment of the present application.
  • Fig. 20C shows the merging result graph of the grid in the embodiment of the present application.
  • FIG. 21 shows an exemplary schematic diagram of a perception blind zone in an embodiment of the present application.
  • FIG. 22 shows a flowchart of a method for generating communication capability information provided by an embodiment of the present application
  • FIG. 23 shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • Figure 24 shows an exemplary schematic diagram of a first distribution situation
  • FIG. 25 shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • Figure 26 shows a schematic diagram of the distribution of terminal equipment
  • Fig. 27 shows an exemplary schematic diagram of a grid in an embodiment of the present application
  • Fig. 28 shows an exemplary schematic diagram of the grid merging result in the embodiment of the present application.
  • Fig. 29 shows an exemplary schematic diagram of a grid in an embodiment of the present application.
  • Fig. 30 shows an exemplary schematic diagram of the merging result of grids in the embodiment of the present application.
  • Fig. 31 shows an exemplary schematic diagram of a communication dead zone according to an embodiment of the present application.
  • FIG. 1 is a schematic diagram of an application scenario applicable to the embodiment of the present application, specifically a schematic diagram of a communication system including roadside equipment, vehicles, and a server, wherein:
  • Roadside equipment, which can be installed on the roadside (or at an intersection). The roadside equipment can communicate with the server and/or vehicles to provide multiple functional services; for example, the roadside equipment collects surrounding information and provides it to the server and/or the vehicle, or provides one or more services such as vehicle identification, electronic toll collection, and electronic point deduction for the vehicle.
  • Roadside equipment may include perception equipment (or referred to as sensors) and/or communication equipment.
  • the sensing device of the roadside device can collect surrounding information (for example, road information), and then provide vehicle-road coordination services.
  • the sensing device may include one or more of millimeter-wave radar, lidar, or visual sensors (such as cameras, etc.).
  • the roadside equipment has a certain coverage, which represents the service area that the roadside equipment can provide. Further, the coverage may include at least one of perception coverage, communication coverage and the like.
  • the roadside device can detect objects through the lidar, and the field of view of the roadside device's lidar can be regarded as a perception coverage of the roadside device.
  • FIG. 2 is a schematic diagram of a possible sensing coverage provided by the embodiment of the present application.
  • the sensing device 201 can be regarded as a roadside device or a module in the roadside device (or the sensing device 201 can be connected to roadside equipment).
  • the coverage of the sensing device 201 is shown in the figure, where different areas within the coverage correspond to different coverage capabilities.
  • In FIG. 2, taking the accuracy rate as an example to describe the coverage capability, different areas within the coverage range correspond to different accuracy rates of the perception results.
  • Generally speaking, as the distance from the sensing device 201 increases, the accuracy of the sensing result becomes lower and lower, that is to say, the coverage capability gradually weakens.
  • the communication device of the roadside device may support the roadside device to communicate with other devices (such as vehicles, cloud, or other roadside devices).
  • a communication device may receive externally transmitted data and/or transmit externally.
  • A communication device may include modules or interfaces related to wired links such as Ethernet cables, may include modules or interfaces related to wireless link technologies (Wi-Fi, Bluetooth, general wireless transmission, vehicle short-range communication technology, etc.), or may include both wired-link-related modules or interfaces and wireless-link-related modules or interfaces.
  • When the roadside equipment includes communication equipment (for example, when the roadside equipment includes a roadside unit (RSU) or is itself a roadside unit), the roadside equipment can communicate with surrounding vehicles, other roadside equipment, or devices such as a server or a terminal, where the terminal may be an electronic device such as a mobile phone, a portable computer, or a smart wearable device.
  • the area capable of communicating with the communication equipment in the roadside equipment can be regarded as the communication coverage area of the roadside equipment.
  • The communication equipment of the above roadside equipment may include a radio frequency part and a baseband part, where the radio frequency part includes an antenna and a radio frequency circuit; the area that the wireless signal transmitted by the roadside equipment through the antenna can reach can be regarded as the communication coverage of the roadside equipment.
  • the area where the roadside device can receive signals through the antenna may be regarded as the communication coverage area of the roadside device.
  • FIG. 3 is a schematic diagram of a possible communication coverage provided by the embodiment of the present application.
  • the communication device 301 can be regarded as a roadside device or a module in the roadside device (or a communication device 301 can be connected with roadside equipment).
  • The coverage of the communication device 301 is shown in the figure, where different areas within the coverage correspond to different coverage capabilities. Taking the accuracy rate as an example to describe the coverage capability, different areas correspond to different accuracy rates of communication results. Generally speaking, as the distance from the communication device 301 increases, the accuracy rate of data transmission during communication becomes lower and lower; that is, the communication capability gradually weakens.
  • Providing the coverage information of the roadside equipment to the vehicle enables the vehicle to select or process the environmental information provided by the roadside equipment according to the coverage information, which improves the accuracy of the environmental information used by the vehicle and improves the driving safety of the vehicle.
  • the roadside device may be an independent device or integrated into other devices.
  • the roadside device may be integrated into equipment such as smart gas stations, charging piles, smart signal lights, street lights, utility poles, or traffic signs.
  • The aforementioned roadside (or intersection, road edge) can be an outdoor road, including various main roads, auxiliary roads, elevated roads or temporary roads, and can also be an indoor road, such as a road in an indoor parking lot.
  • the vehicle involved in the embodiment of this application is a device that is driven by power and generally includes various subsystems, such as but not limited to a travel system, a sensor system, a control system, one or more peripheral devices, a power supply, and a user interface and so on.
  • the vehicle may also include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each subsystem and element of the vehicle can be interconnected by wire or wirelessly.
  • the vehicle in the embodiment of the present application may be a car, an electric vehicle, or a track-running vehicle, or an intelligent vehicle (such as an unmanned vehicle), an intelligent mobile robot, and the like.
  • the intelligent vehicle supports the perception of the road environment through the on-board sensor system, automatically plans the driving route and controls the vehicle to reach the predetermined target location.
  • Smart cars make concentrated use of technologies such as computers, modern sensing, information fusion, communication, artificial intelligence and automatic control; they are a high-tech complex integrating environmental perception, planning and decision-making, and multi-level assisted driving.
  • the smart vehicle may specifically be a car with an assisted driving system or a fully automatic driving system, a wheeled mobile robot, and the like.
  • the server can be implemented by devices such as servers, mobile terminals, hosts, virtual machines or robots.
  • the server may include one server or a server cluster composed of multiple servers.
  • the server may also be a cloud, and the cloud may include a cloud server and/or a cloud virtual machine.
  • the cloud can be deployed on a public cloud, a private cloud, or a hybrid cloud.
  • connection media including a wired link (such as an optical fiber), a wireless link, or a combination of a wired link and a wireless link, and the like.
  • the connection medium can be a wireless link
  • the wireless link adopts a short-distance connection technology, such as 802.11b/g technology, Bluetooth (Blue Tooth) technology, Zigbee (Zigbee) technology, radio frequency identification (Radio Frequency Identification, RFID) technology, ultra-wideband (Ultra Wideband, UWB) technology, wireless short-range communication (such as vehicle-mounted wireless short-distance communication) technology or vehicle networking (vehicle to everything, V2X, information exchange between vehicles and the outside world) technology, etc.
  • the wireless link adopts long-distance connection technology, such as Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS), LTE, 5G or other wireless access technologies.
  • the server can communicate with the vehicle to provide various services for the vehicle, such as high-definition map service, automatic driving or assisted driving service, etc.
  • the vehicle can interact with the server and use various services provided by the cloud.
  • the high-precision map service can be used to improve automatic driving or assisted driving functions, thereby improving the driving safety and travel efficiency of the vehicle.
  • the vehicle can download high-precision map data from the server to obtain a high-precision map to provide users with more accurate navigation services.
  • the vehicle can obtain high-precision maps in real time while driving, improving the safety of the vehicle's automatic driving decision-making. Due to the dynamic changes in the environment, in addition to static layers, high-precision maps increasingly need more dynamic information to meet the development needs of the transportation field.
  • Roadside equipment is an important source of dynamic information for high-precision maps.
  • roadside devices can be used to provide environmental information on the road, such as traffic light information, obstacle information, and the like.
  • the coverage of the roadside equipment is limited, and there is a certain reliability evaluation for services provided by the roadside equipment (such as perception results, communication results, etc.) in different coverage areas.
  • providing the coverage of roadside equipment to the cloud as high-precision map information can improve the service quality of high-precision maps.
  • providing coverage to the vehicle for use in determining driving strategies can improve the reliability of the vehicle's driving decisions.
  • FIG. 4 is a schematic flowchart of a data processing method provided in an embodiment of the present application.
  • the data processing method shown in FIG. 4 may be applicable to the scenario shown in FIG. 1 above.
  • the data processing method may at least include step S401 and step S402, specifically as follows:
  • Step S401 Obtain coverage information of roadside equipment.
  • the coverage information includes coverage area information and coverage capability information, wherein the coverage area information is used to indicate at least one coverage area of the roadside device, and the coverage capability information is used to indicate the coverage capability of the roadside device in the at least one coverage area.
  • this step is performed by a data processing device, and the data processing device may be located at a server, a roadside device or a vehicle.
  • the roadside equipment can be installed on the roadside (or intersection, roadside).
  • the roadside equipment has a certain coverage, which represents the service area that the roadside equipment can provide. Further, the coverage may include at least one of perception coverage, communication coverage and the like. For descriptions of roadside devices, coverage, communication coverage, perception coverage, etc., reference may be made to the foregoing description of FIG. 1 , and details will not be repeated here.
  • the coverage area can be divided into different coverage area types such as perception coverage area and communication coverage area.
  • the coverage area can be indicated by geometric shapes, coordinates, or relative positions. Here are three possible designs:
• Design 1: the coverage area can be indicated by the relative positions of the end points of the coverage area with respect to the roadside equipment.
  • FIG. 5A is a schematic diagram of a possible method for indicating the coverage area provided by the embodiment of the present application.
• taking the roadside device (or the sensing device associated with the roadside device) 501 as a reference point, the relative positions of point A, point B, point C, and point D with respect to the roadside device 501 can indicate the coverage area with a correct rate ≥ 90%.
• similarly, the coverage area with a correct rate ≥ 75% can be determined. Other coverage areas can be deduced similarly.
• the coverage area where 90% > correct rate ≥ 75% can be determined by the location of the roadside device 501 and the relative positions of points C, D, E, and F with respect to the roadside device 501.
• Design 2: FIG. 5B is a schematic diagram of another possible method for indicating the coverage area provided by the embodiment of the present application. Through the positions of point I, point J, point K, and point L relative to the reference point O, the coverage area with a correct rate ≥ 90% can be determined. Similarly, through the positions of point I, point J, point M, and point N relative to the reference point O, the coverage area with a correct rate ≥ 75% can be determined. Other coverage areas can be deduced similarly.
• the coverage area where 90% > correct rate ≥ 75% can be determined by the positions of point K, point L, point M, and point N relative to the reference point O.
  • Design 3 Geometric shapes are used to indicate the coverage area. Referring to FIG. 3 , taking the shape of the communication coverage area as a circle as an example, the coverage area can be indicated by the position and radius of the communication device 301 (or the roadside device where the communication device is located). For example, if the accuracy rate is 98% within a radius of 15 meters (m), the coverage area is a circle with the center of the communication device 301 (or a pre-configured center point) and a radius of 15 m.
  • the shape of the communication coverage area is not limited, for example, it may also be a fan-shaped area.
  • longitude, latitude, etc. may also be used to describe the endpoint, thereby indicating the coverage area.
  • the coverage area may also be indicated by a six-degree-of-freedom pose, etc., which will not be described here.
  • the coverage area may also be processed by cutting, splicing, intersection, union, and the like.
  • a coverage area may include a section of a road or a section of a lane.
  • FIG. 6 is a schematic diagram of another possible indication method of a coverage area provided by an embodiment of the present application. It can be seen that the coverage area of the roadside device 601 is the intersection of the actual coverage area of the roadside device 601 and the road section. It should be understood that the same applies to the case where the coverage area includes a section section of a lane. Similarly, in some implementation situations, the coverage area may also include the area within a certain range from the edge of the road. Or in some implementation situations, the coverage area may also include sections such as sidewalks and auxiliary roads.
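The following is a minimal, non-normative Python sketch of how a coverage area described by the designs above might be represented as a data structure; the class and field names (PolygonArea, CircularArea, CoverageArea, capability) and the example coordinates are illustrative assumptions rather than part of the embodiment.

```python
from dataclasses import dataclass
from typing import List, Tuple

# (x, y) offsets in metres relative to a reference point, or (longitude, latitude)
Point = Tuple[float, float]

@dataclass
class PolygonArea:
    """Designs 1 and 2: a coverage area given by end points relative to a reference."""
    reference: Point          # roadside device position, or a pre-configured point O
    endpoints: List[Point]    # e.g. points A, B, C, D relative to the reference

@dataclass
class CircularArea:
    """Design 3: a coverage area given by a geometric shape (here, a circle)."""
    center: Point             # communication device position or pre-configured centre
    radius_m: float           # e.g. 15 m

@dataclass
class CoverageArea:
    area_type: str            # "perception" or "communication"
    geometry: object          # a PolygonArea or CircularArea
    capability: str           # e.g. "correct rate >= 90%"

# Example (values invented for illustration): the >= 90% area of FIG. 5A
area_90 = CoverageArea(
    area_type="perception",
    geometry=PolygonArea(reference=(0.0, 0.0),
                         endpoints=[(-10.0, 0.0), (10.0, 0.0), (12.0, 40.0), (-12.0, 40.0)]),
    capability="correct rate >= 90%",
)
```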
  • the aforementioned coverage capability is specifically the coverage capability of the roadside device within the coverage area, and the coverage capability can be described by coverage capability information.
  • the perception coverage capability is the coverage capability of the roadside device in the perception coverage area
  • the communication coverage capability is the coverage capability of the roadside device in the communication coverage area.
• the coverage capability information may indicate different indicators (which may also be referred to as contents).
• when the coverage capability is the coverage capability of the roadside equipment in the communication coverage area, the coverage capability information can be used to indicate at least one of the following contents (or indicators): data accuracy rate, packet loss rate, communication delay, communication stability, or signal strength, etc.
  • the foregoing content may also be referred to as a basic indicator.
  • the coverage capability is the coverage capability of the roadside equipment within the perceived coverage area
• the coverage capability information is used to indicate at least one of the following contents (or indicators): correct rate of sensing results, false detection rate, missed detection rate, recall rate, perceptual precision, perceptual average precision (Average Precision, AP), detection stability, or detection position precision, etc.
  • the foregoing content may also be called a basic indicator.
• the correct rate of perception results is used to indicate the ratio of correctly detected results to all detected results; the false detection rate refers to the ratio of wrongly detected results to all detected results; the missed detection rate refers to the ratio of results that were not detected to all results that should have been detected; the recall rate (Recall) is used to refer to the ratio of correctly detected results to all results (or all results that should have been detected); perceptual precision and perceptual average precision can be used to evaluate the correct rate and/or the recall rate; detection stability is used to indicate the ability of the various detection indicators to remain constant over time; detection position accuracy is used to describe the correspondence between the position of the perception result and the real position.
  • the recall rate is related to the real result and detection result of the sample.
• the relationship between the real result of a sample and the detection result can be in the following categories: True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN). True and False are used to judge whether the result is correct or not, and Positive and Negative are used to judge whether the result is positive or negative.
• the total number of samples = TP + FP + TN + FN.
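As an illustration of how the basic indicators above relate to the TP/FP/TN/FN counts, the following is a minimal Python sketch; the function name and exact indicator formulas are assumptions consistent with the common definitions given above.

```python
def perception_indicators(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Basic indicators derived from matching counts.

    tp: correctly detected results, fp: wrongly detected results,
    tn: correctly rejected results, fn: results that should have been
    detected but were missed.  Total number of samples = tp + fp + tn + fn.
    """
    detected = tp + fp          # everything the device reported
    ground_truth = tp + fn      # everything that should have been detected
    return {
        "correct_rate": tp / detected if detected else 0.0,
        "false_detection_rate": fp / detected if detected else 0.0,
        "missed_detection_rate": fn / ground_truth if ground_truth else 0.0,
        "recall": tp / ground_truth if ground_truth else 0.0,
    }

# Example: 90 correct detections, 5 false detections, 10 missed targets
print(perception_indicators(tp=90, fp=5, tn=0, fn=10))
```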
  • the number of coverage areas included in the coverage information may be one or more.
  • the coverage capability information may also be one or more.
  • the coverage area may also have different coverage area types such as perception coverage area and communication coverage area.
  • the coverage information may include one type of coverage area, or may include multiple types of coverage areas, for example, include both communication coverage areas and perception coverage areas.
• At least one coverage area indicated by the coverage information may include M communication coverage areas and N perception coverage areas, where M and N are natural numbers, and M and N are not both 0.
• the coverage information includes coverage area information of at least two perceived coverage areas. Further, coverage capability information corresponding to the at least two sensing coverage areas may also be included. For example, please refer to Table 1. Table 1 is a possible coverage information provided by the embodiment of the present application, and the coverage information may be used to describe the perceived coverage as shown in FIG. 2. Taking the correct rate to describe the coverage capability as an example, different areas correspond to different correct rates of perception results; for example, within coverage area 1, the corresponding correct rate of perception results is ≥ 90%.
• Table 1 — Coverage capability and corresponding coverage area:
  coverage capability: correct rate of perception results ≥ 90%; coverage area: coverage area 1
  coverage capability: correct rate of perception results ≥ 75%; coverage area: coverage area 2
  coverage capability: correct rate of perception results ≥ 60%; coverage area: coverage area 3
  ...
  • the coverage capability may be represented by sensing capabilities, and the multiple sensing coverage areas may be divided according to levels of sensing capabilities.
  • the perception ability can be divided into multiple levels, such as the first level, the second level, the third level and so on. Wherein, coverage area 1 corresponds to the first level, coverage area 2 corresponds to the second level, and coverage area 3 corresponds to the third level.
  • the level of perception ability in this embodiment of the present application may be determined according to the strength of perception ability (such as the accuracy rate, etc.), and may also be pre-defined, pre-configured, or obtained according to protocol regulations.
• the coverage information includes coverage area information of at least two communication coverage areas. Further, coverage capability information corresponding to the at least two communication coverage areas may also be included. For example, please refer to Table 2. Table 2 is a possible coverage information provided by the embodiment of the present application, and the coverage information may be used to describe the communication coverage shown in FIG. 3. Taking the accuracy rate to describe the coverage capability as an example, different areas correspond to different accuracy rates of communication results; for example, within coverage area 4, the corresponding communication data accuracy rate is ≥ 98%.
• Table 2 — Coverage capability and corresponding coverage area:
  coverage capability: data accuracy ≥ 98%; coverage area: coverage area 4
  coverage capability: data accuracy ≥ 95%; coverage area: coverage area 5
  ...
  • the coverage capability may be represented by communication capability, and then the multiple communication coverage areas may be divided according to the level of communication capability.
  • the communication capability can be divided into multiple levels, such as the first level, the second level, the third level and so on. Wherein, coverage area 4 corresponds to the first level, and coverage area 5 corresponds to the second level.
  • the level of the communication capability may also be related to other indicators.
• the first level is the coverage area with data accuracy ≥ 98% and communication delay ≤ 50 ms; the second level is the coverage area with data accuracy ≥ 95% and communication delay ≤ 70 ms, and so on.
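A minimal sketch, assuming the example thresholds above, of how measured communication indicators could be mapped to a communication capability level; the function name and the fallback level 3 are illustrative assumptions.

```python
def communication_level(data_accuracy: float, delay_ms: float) -> int:
    """Map measured communication indicators to a capability level.

    Thresholds follow the example above (illustrative, not normative):
    level 1: accuracy >= 98% and delay <= 50 ms
    level 2: accuracy >= 95% and delay <= 70 ms
    level 3: everything else (assumed fallback level)
    """
    if data_accuracy >= 0.98 and delay_ms <= 50:
        return 1
    if data_accuracy >= 0.95 and delay_ms <= 70:
        return 2
    return 3

print(communication_level(0.985, 42))   # -> 1
print(communication_level(0.96, 65))    # -> 2
```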
  • the coverage information may not include coverage capability information.
  • the coverage information can be indexed by a data number, where the data content corresponding to the data number can be pre-configured, defined through a protocol, or obtained through negotiation.
• for example, the content of data number 1001 corresponds to the communication coverage area with a correct rate > 98% (ie, coverage area 4), the content of data number 1002 corresponds to the communication coverage area with a correct rate > 95% (ie, coverage area 5), and so on for other content.
• by agreeing on the format of the coverage information in advance, redundant coverage capability information can be reduced, and the efficiency of data transmission can be improved.
• using data numbers for indexing is only exemplary here; in specific implementations, data numbers may also be replaced by data identifiers, bit positions, etc., which will not be repeated here.
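The following sketch illustrates indexing coverage information by pre-configured data numbers as described above; the table contents and names are assumptions for illustration only.

```python
# Pre-configured mapping (values assumed for illustration): data number -> semantics.
# In practice this table would be pre-configured, defined by protocol, or negotiated.
DATA_NUMBER_TABLE = {
    1001: {"type": "communication", "capability": "correct rate > 98%", "area": "coverage area 4"},
    1002: {"type": "communication", "capability": "correct rate > 95%", "area": "coverage area 5"},
}

def decode_coverage(data_number: int) -> dict:
    """Resolve a transmitted data number into its pre-configured meaning."""
    return DATA_NUMBER_TABLE[data_number]

# The sender only transmits the number; the receiver restores the full description.
print(decode_coverage(1001))
```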
  • the data processing device obtains the coverage information of the roadside equipment, which may specifically include the following implementation methods:
  • Implementation mode 1 The data processing device receives coverage information sent by other devices.
• the following are two possible examples of coverage information sent by other devices:
  • Example 1 The data processing device can receive the coverage information sent by the roadside device.
  • FIG. 7 is a schematic diagram of a scenario where this embodiment of the present application may be applicable.
  • the data processing device may be included in a vehicle 702 (or cloud 703 ), and may receive coverage information sent from a roadside device 701 .
• Example 2: The data processing device may receive the coverage information sent by the server (or cloud).
  • the data processing device can be included in the vehicle 702, and the vehicle 702 can obtain the coverage information of the roadside equipment from the cloud 703 (such as a map cloud, or an assisted driving cloud, etc.).
• the data processing device can also be included in the roadside equipment 701, and the roadside equipment 701 can receive the coverage information sent by the vehicle 702 (or cloud 703). The coverage information can include the coverage information of the roadside equipment 701 and, optionally, the coverage information of other roadside devices (not shown in FIG. 7).
  • Implementation 2 The data processing device generates the coverage information of the roadside equipment.
  • the coverage information for roadside device perception capability and communication capability is introduced respectively below.
  • the coverage area of the roadside device is the perception area
  • the coverage capability information used to indicate the coverage capability of the roadside device in the at least one coverage area is the perception capability information.
  • FIG. 17 shows a flowchart of a method for generating perception capability information provided by an embodiment of the present application. As shown in Figure 17, the method includes:
  • Step S1701 obtaining roadside sensing results and multi-source fusion sensing results.
  • Step S1702 matching the roadside sensing result with the multi-source fusion sensing result to obtain matching results of multiple target locations.
  • Step S1703 Based on the matching result, generate first perception capability information of the first roadside device.
  • the first roadside device represents the roadside device whose perception capability is to be determined.
  • the first roadside device may be any roadside device.
  • the first perception capability information may represent the perception capability of the first roadside device.
  • the first perception capability information may be used to indicate the perception capability of the first roadside device, for example, an area that the first roadside device can sense and an area that cannot be sensed. Based on the matching result of the roadside sensing result and the multi-source fusion sensing result, the first sensing capability information may be generated.
  • the roadside perception result may be used to indicate the first group of location points of the traffic participant sensed by the first roadside device within a preset time period.
• the first group of position points can be the position points of the traffic participant perceived by a single sensor in the first roadside device, or a set of position points obtained after fusion within the first roadside device.
  • the multi-source fusion sensing result may be used to indicate the second group of position points obtained by fusing multiple sets of position points of the traffic participant acquired by multiple sensing devices within a preset time period.
• the plurality of sensing devices may be of the same type or of different types, and may be located on different carriers whose types may be the same or different, such as roadside devices, vehicles, or portable terminals (which may also be referred to as mobile terminals). That is, the sensing devices may be located in multiple roadside devices, in multiple vehicles, in multiple portable terminals, or distributed across two or three of the three types of devices: roadside devices, vehicles, and portable terminals.
• the preset time period can be any time period; for example, it can be in units of months, weeks, days, or hours, such as 1 month, 1 week, or 1 day. The preset time period can be set as required, which is not limited in this application. It can be understood that the longer the preset time period and the more location points of traffic participants that are used, the higher the accuracy of the obtained first perception capability information.
  • the roadside perception results and multi-source fusion perception results are the perception results of traffic participants around the same roadside equipment in the same time period.
  • the roadside perception result reflects the traffic participants actually sensed by the first roadside device within a preset time period.
• the multi-source fusion sensing results use data from multiple sensing devices, reflecting the traffic participants actually perceived by the multiple sensing devices within the preset period of time. Since these sensing devices make up for each other's blind angles and shortcomings, the confidence of the multi-source fusion sensing results is high, and they can be used as a reference standard for the roadside sensing results to determine whether the roadside sensing results are accurate, so as to determine the perception capability of the first roadside device.
• if the first roadside device has perceived the traffic participants indicated by the multi-source fusion sensing results, it indicates that these traffic participants are within the sensing range of the first roadside device; if the first roadside device did not perceive the traffic participants indicated by the multi-source fusion sensing results, it indicates that these traffic participants are beyond the sensing range of the first roadside device. For example, when a pedestrian crosses the roadside greenery, the pedestrian does not report his location information through a mobile terminal, and because he is partially blocked by the greenery, vehicles at some angles do not recognize him while vehicles at other angles do. Therefore, this pedestrian exists in the multi-source fusion perception result.
  • the sensing range of the first roadside device can be determined conveniently and accurately.
  • FIG. 18A shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • the communication system includes a cloud server 11 , a first roadside device 12 , a car-side device 13 , a mobile terminal 14 and a second roadside device 15 .
  • the first roadside device 12 may represent any roadside device.
  • the second roadside device 15 may represent a roadside device other than the first roadside device 12 that establishes a communication connection with the cloud server 11 .
  • the second roadside device 15 may have established a communication connection with the first roadside device 12 , or may not have established a communication connection with the first roadside device 12 .
  • the roadside device among the second roadside devices 15 that has established a communication connection with the first roadside device 12 is called the third roadside device.
  • the first roadside device 12 , the second roadside device 15 , the vehicle-side device 13 and the mobile terminal 14 have respectively established communication connections with the cloud server 11 .
  • the vehicle end device 13 and the mobile terminal 14 have also established a communication connection with the first roadside device 12 respectively.
• the first roadside device 12, the second roadside device 15, the vehicle-end device 13, and the mobile terminal 14 can each establish a communication connection with the cloud server 11 through a cellular network (for example, 3G, 4G, or 5G).
  • a communication connection can also be established between the mobile terminal 14 and the first roadside device 12 through a cellular network.
  • a communication connection can be established between the vehicle-end device 13 and the first roadside device 12 through dedicated short-range communication (dedicated Short Range Communication, DSRC) technology and other vehicle-to-X (Vehicle to X, V2X) technologies.
  • DSRC dedicated Short Range Communication
  • V2X vehicle to X
  • the vehicle-end device 13 and the first roadside device 12 can establish a communication connection through an on-board unit (On Board Unit, OBU) and a communication device of the roadside device.
  • OBU on Board Unit
  • a communication connection can also be established between the first roadside device 12 and the second roadside device 15 through the V2X technology.
  • the mobile terminal 14 can obtain the terminal location data through the terminal positioning device, and then can report the terminal location data to the first roadside device 12 through the V2X network, and report the terminal location data to the cloud server 11 through the cellular network.
  • the vehicle end device 13 can obtain vehicle location data through a vehicle positioning device, and obtain vehicle perception data through a vehicle sensing device. Afterwards, the vehicle end device 13 can report the vehicle location data and vehicle perception data to the first roadside device 12 through the V2X network, and report the vehicle location data and vehicle perception data to the cloud server 11 through the cellular network.
  • the first roadside device 12 can obtain roadside perception data through the roadside sensing device, obtain terminal location data through the mobile terminal 14 , and obtain vehicle location data and vehicle perception data through the vehicle-end device 13 .
  • the terminal location data, vehicle location data and vehicle perception data may be referred to as roadside collected data of the first roadside device 12 .
• the third roadside device can send its roadside collected data to the first roadside device 12; at this time, the roadside collected data of the first roadside device 12 also includes the roadside collected data of the third roadside device.
  • the roadside collected data of the third roadside device can still be reported to the cloud server, thereby improving the reliability of the communication system.
  • the first roadside device 12 may report the roadside sensing data and roadside collection data to the cloud server through the cellular network.
  • the second roadside device 15 may also report the roadside sensing data and roadside collection data to the cloud server through the cellular network.
  • the manner in which the second roadside device 15 acquires the roadside sensing data and the roadside collected data can refer to the manner in which the first roadside device 12 acquires the roadside sensing data and the roadside collected data, and will not be repeated here.
  • the data received by the cloud server 11 includes: roadside sensing data from the first roadside device 12, roadside collected data from the first roadside device 12, roadside sensing data from the second roadside device 15, Roadside collected data from the second roadside device 15 , vehicle location data and vehicle perception data from the vehicle-end device 13 , and terminal location data from the mobile terminal 14 .
  • the cloud server 11 can obtain the roadside sensing result according to the roadside sensing data from the first roadside device 12, and obtain the multi-source fusion sensing result corresponding to the first roadside device according to the above received data.
  • the cloud server 11 can filter out the roadside sensing data within a preset time period from the roadside sensing data from the first roadside device 12 to obtain the roadside sensing result of the first roadside device; From the obtained data, the data within the preset time period and within the pre-selected range are screened out, and the screened data are fused to obtain the multi-source fusion sensing result of the first roadside device.
  • the preselected range is the area around the first roadside device
• the preselected range can be determined according to the factory index of the sensing range of the first roadside device and the installation direction of the first roadside device; for example, a certain margin can be reserved beyond the factory sensing range in the installation direction, such as expanding it by 3 meters or 5 meters.
• filtering out the data within the preset time period and within the preselected range for fusion can reduce the amount of data for fusion and matching, thereby reducing the amount of calculation and improving efficiency. It is understandable that in the process of obtaining multi-source fusion sensing results, the more sensing devices are involved, the more traffic participants are involved, or the longer the preset time period is, the more accurate the multi-source fusion sensing results will be.
• the cloud server 11 can match the roadside sensing result with the multi-source fusion sensing result to obtain matching results of multiple target position points, and based on the matching results, generate the first perception capability information of the first roadside device. Afterwards, as shown in FIG. 18A, the cloud server 11 may deliver the first perception capability information to the first roadside device 12, the vehicle-end device 13, the mobile terminal 14, the second roadside device 15, and the like. After receiving the first perception capability information, the first roadside device 12 may forward the first perception capability information to the vehicle-end device 13, the mobile terminal 14, and the third roadside device among the second roadside devices 15.
  • FIG. 18B shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • the process of receiving data by the cloud server 11 may refer to the process of receiving data by the cloud server 11 in FIG. 18A , which will not be repeated here.
  • the data received by the cloud server 11 includes: roadside perception data from the first roadside device 12, roadside collected data from the first roadside device 12, roadside perception data from the second roadside device 15.
  • the data includes roadside collected data from the second roadside device 15 , vehicle location data and vehicle perception data from the vehicle-end device 13 , and terminal location data from the mobile terminal 14 .
  • the cloud server 11 can obtain the multi-source fusion sensing result corresponding to the first roadside device according to the above received data. Afterwards, the cloud server 11 may send the multi-source fusion sensing result corresponding to the first roadside device to the first roadside device 12 .
  • the first roadside device 12 may obtain a roadside sensing result according to its own roadside sensing data.
  • the first roadside device 12 can match the roadside sensing result and the multi-source fusion sensing result to obtain matching results of multiple target position points, and based on the matching result , to generate first perception capability information of the first roadside device. Afterwards, as shown in FIG. 18B , the first roadside device 12 may send the first perception capability information to the vehicle-end device 13 , the mobile terminal 14 and the third roadside device among the second roadside device 15 .
• the process of matching the roadside sensing results with the multi-source fusion sensing results to obtain the matching results of multiple target position points, and of generating the first perception capability information of the first roadside device based on the matching results, will be described in detail in the following sections of the embodiments of this application.
  • FIG. 18C shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • the communication system may include a first roadside device 12 , a vehicle-side device 13 , a mobile terminal 14 and a third roadside device 16 .
  • the vehicle end device 13 , the mobile terminal 14 and the third roadside device 16 have respectively established communication connections with the first roadside device 12 .
  • the vehicle end device 13 reports the vehicle location data and vehicle perception data to the first roadside device 12
  • the mobile terminal 14 reports the terminal location data to the first roadside device 12
• the third roadside device 16 reports its roadside sensing data and roadside collected data to the first roadside device 12.
• the data acquired by the first roadside device 12 includes: vehicle position data and vehicle perception data from the vehicle-end device 13, terminal position data from the mobile terminal 14, roadside perception data and roadside collected data from the third roadside device 16, as well as its own roadside perception data.
  • the first roadside device 12 may acquire a roadside sensing result based on its own roadside sensing data, and acquire a multi-source fusion sensing result based on the acquired data.
  • the way for the first roadside device 12 to acquire the roadside sensing result and the way to acquire the multi-source fusion sensing result can refer to the way the cloud server 11 in FIG. 18A acquires the roadside sensing result and the way to acquire the multi-source fusion sensing result, I won't go into details here.
• the first roadside device 12 can match the roadside sensing result with the multi-source fusion sensing result to obtain matching results of multiple target position points, and based on the matching results, generate the first perception capability information of the first roadside device. Afterwards, as shown in FIG. 18C, the first roadside device 12 may send the first perception capability information to the vehicle-end device 13, the mobile terminal 14, and the third roadside device 16.
• the process of matching the roadside sensing results with the multi-source fusion sensing results to obtain the matching results of multiple target position points, and of generating the first perception capability information of the first roadside device based on the matching results, will be described in detail in the following sections of the embodiments of this application.
  • the first roadside device can sense one or more traffic participants within a preset time period, and each sensed traffic participant corresponds to a group of location points, called the first group of location points. That is to say, the roadside perception result may indicate a first group of position points of each traffic participant among the one or more traffic participants sensed by the first roadside device within a preset time period. Specifically, the roadside perception result may include at least one item of time information, location information, motion parameters, and attribute information of each location point in the first group of location points indicated by the roadside perception result.
  • the location change information of the same traffic participant may be acquired by multiple sensing devices.
  • the position change information of the vehicle 1 can be acquired by its own vehicle-end equipment, sensed by surrounding roadside equipment, and sensed by vehicle-end equipment of other surrounding vehicles.
• each sensing device that acquires the traffic participant's location change information can obtain a set of location points of the traffic participant; by fusing the sets of location points from all sensing devices that sensed the traffic participant's location change, a group of location points corresponding to the traffic participant can be obtained, which is called the second group of location points.
  • Kalman filter, multi-Bayesian estimation method, fuzzy logic reasoning or artificial neural network can be used to fuse data acquired by multiple sensing devices.
  • the first group of location points of a traffic participant is a group of location points sensed by the first roadside device
  • the second group of location points of a traffic participant is obtained by the fusion of multiple groups of location points acquired by multiple sensing devices A set of location points.
  • the location points (including the first group of location points and the second group of location points) indicated by the roadside sensing result and the multi-source fusion sensing result are discrete location points.
  • the roadside perception result includes at least one item of time information, location information, motion parameters and attribute information of each location point in the first group of location points.
  • the multi-source fusion sensing result includes at least one item of time information, position information, motion parameters and attribute information of each position point in the second group of position points.
  • Matching the roadside sensing result with the multi-source fusion sensing result includes: matching the first set of position points with the second set of position points point by point.
  • point-by-point matching does not require a sequential relationship, which reduces the difficulty of obtaining roadside sensing results and multi-source fusion sensing results.
  • the location points (including the first group of location points and the second group of location points) indicated by the roadside sensing result and the multi-source fusion sensing result are the location points in the trajectory.
  • Fig. 19A shows a schematic diagram of the first group of position points and corresponding trajectories in the embodiment of the present application.
  • Fig. 19B shows a schematic diagram of the second group of position points and corresponding trajectories in the embodiment of the present application.
  • the roadside perception result includes a time series relationship among the position points in the first group of position points, and at least one item of time information, position information, motion parameters and attribute information of each position point in the first group of position points.
  • the multi-source fusion sensing result includes a time series relationship among the position points in the second group of position points, and at least one item of time information, position information, motion parameters and attribute information of each position point in the second group of position points.
  • Matching the roadside sensing result with the multi-source fusion sensing result includes: performing trajectory matching on the roadside sensing result and the multi-source fusion sensing result.
  • the trajectory matching algorithm may include but not limited to Hungarian Algorithm and K-means algorithm.
  • trajectory matching incorporates temporal relationships, which can improve the accuracy and confidence of matching results.
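A minimal sketch of trajectory matching using the Hungarian algorithm mentioned above (here via scipy's linear_sum_assignment, an implementation choice assumed for illustration); the distance metric and gate value are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm

def trajectory_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Mean point-wise distance between two trajectories of shape [T, 2]
    sampled at the same timestamps; the longer one is truncated."""
    t = min(len(a), len(b))
    return float(np.linalg.norm(a[:t] - b[:t], axis=1).mean())

def match_trajectories(roadside: list, fused: list, gate_m: float = 2.0):
    """Match roadside trajectories to multi-source fused trajectories.

    Returns index pairs (i, j) whose mean distance is below gate_m metres
    (the gate value is an assumption for illustration).
    """
    cost = np.array([[trajectory_distance(r, f) for f in fused] for r in roadside])
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < gate_m]
```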
  • a target location point is a location point in the first group of location points or a location point in the second group of location points.
  • the matching result of a target location point is one of True Positive (TP), False Negative (FN) and False Positive (FP).
  • the matching result of a target position point being TP means that the target position point is a position point in the second group of position points, and there is a position point matching the target position point in the first group of position points.
  • the matching result of a target position point being FN means: the target position point is a position point in the second group of position points, and there is no position point matching the target position point in the first group of position points.
  • the matching result of a target position point being FP means that the target position point is a position point in the first group of position points, and there is no position point matching the target position point in the second group of position points.
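The following sketch illustrates how, once points have been matched, the matching result of each target location point can be classified as TP, FN, or FP according to the definitions above; the data representation (hashable point identifiers and a match mapping) is an assumption.

```python
def classify_points(first_group: set, second_group: set, matches: dict) -> dict:
    """Classify target location points as TP, FN or FP.

    `matches` maps a point in the second group to its matched point in the
    first group (or to None if unmatched); points are any hashable identifiers.
    """
    results = {}
    matched_first = set()
    for p2 in second_group:
        p1 = matches.get(p2)
        if p1 is not None and p1 in first_group:
            results[p2] = "TP"   # fused point that the roadside device also perceived
            matched_first.add(p1)
        else:
            results[p2] = "FN"   # fused point that the roadside device missed
    for p1 in first_group - matched_first:
        results[p1] = "FP"       # roadside point with no fused counterpart
    return results
```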
  • FIG. 19C shows a schematic diagram of matching results in the embodiment of the present application.
• k1, k2, and k3 are the trajectories corresponding to the roadside perception results, and the location points on k1, k2, and k3 are the location points in the first group of location points; h1, h2, and h3 are the trajectories corresponding to the multi-source fusion perception results, and the position points on h1, h2, and h3 are the position points in the second group of position points.
• through trajectory matching, it is found that h1 matches k1, h2 matches k2, there is no trajectory matching h3, and there is no trajectory matching k3.
• the location points on h1 and h2 are target location points whose matching result is TP.
  • the location point on h3 it belongs to the second group of location points and there is no matching location point in the first group of location points, therefore, the location point on h3 is the target location point and the matching result is FN.
  • the location point on k3 it belongs to the first group of location points and there is no matching location point in the second group of location points, so the location point on k3 is the target location point and the matching result is FP.
  • Fig. 19D shows a schematic diagram of trajectory matching in the embodiment of the present application.
• k4, k5, and k6 are the trajectories corresponding to the roadside perception results, and the position points on k4, k5, and k6 are the position points in the first group of position points; h4, h5, and h6 are the trajectories corresponding to the multi-source fusion perception results, and the position points on h4, h5, and h6 are the position points in the second group of position points.
• if parts of k4 and k5 are misjudged as one trajectory, the combined trajectory of k4 and k5 will be matched with h4, so that the position points on h4 are misjudged as target position points whose matching result is TP. Similarly, if the part of k4 from t+5 to t+7 and the part of k6 from t to t+5 are misjudged as one trajectory, the combined trajectory of k4 and k6 will be matched with h6, so that the position points on h6 are misjudged as target position points whose matching result is TP.
• the roadside perception results and the multi-source fusion perception results include attribute information such as geometric shape, size, and color, which can reduce the possibility of misjudging trajectories when the trajectories of different traffic participants meet, thereby improving the accuracy of target matching.
• in a possible implementation, a target location point whose matching result is TP may be associated with index information; the index information may include one or more of motion index error, shape size error, target tracking stability, and the correct matching rate of location points.
  • the motion index error includes position error and/or speed error.
• the position error may be dx/dy, wherein dx represents the horizontal (or longitude) difference between the target position point and the position point matched with it in the first group of position points, and dy represents the vertical (or latitude) difference between the target position point and the position point matched with it in the first group of position points.
  • the speed error can be one or more of speed difference, speed ratio, acceleration difference and acceleration ratio.
  • the shape size error can be a difference in size or a ratio of size.
• the target tracking stability indicates the deviation between the estimated position point and the collected position point, which can reflect the reliability of a group of position points: if the target tracking stability is higher, the reliability of this group of position points is higher; if the target tracking stability is lower, the reliability of this group of position points is lower.
  • methods such as Kalman filter, hidden Markov model or mean shift can be used to estimate the position point.
  • the correct matching rate of position points indicates the ratio of the number of position points whose matching result is TP in the second group of position points to the total number of position points in the second group of position points.
• for target position points in the same group, the associated target tracking stability is the same, and the associated correct matching rate of position points is also the same. It can be understood that the above is only an exemplary description of the index information, and a target location point whose matching result is TP may also be associated with other index information.
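A minimal sketch of computing some of the index information described above for a TP target point (position error dx/dy, speed error) and the correct matching rate of location points; the dictionary keys and point representation are assumptions.

```python
def index_information(p_target: dict, p_matched: dict) -> dict:
    """Index information for a TP target point and the point matched with it.

    Each point is a dict with assumed keys: x, y (metres) and v (speed, m/s).
    """
    return {
        "dx": p_target["x"] - p_matched["x"],        # horizontal / longitude error
        "dy": p_target["y"] - p_matched["y"],        # vertical / latitude error
        "speed_diff": p_target["v"] - p_matched["v"],
        "speed_ratio": p_matched["v"] / p_target["v"] if p_target["v"] else None,
    }

def correct_matching_rate(results: dict, second_group: set) -> float:
    """Ratio of points whose matching result is TP to all points in the second group."""
    tp = sum(1 for p in second_group if results.get(p) == "TP")
    return tp / len(second_group) if second_group else 0.0
```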
• generating the first perception capability information of the first roadside device may include: determining a plurality of grids based on the preselected range of the first roadside device; merging the grids whose grid indexes meet the first condition to obtain merged grids, and continuing to merge the grids whose grid indexes satisfy the first condition among the existing grids until there is no grid that satisfies the first condition; for any remaining grid, determining the grid as a perception area, and determining the perception capability level of the grid based on the index range to which the grid index of the grid belongs; and determining the perception capability information of the first roadside device based on the location information and perception capability level of each grid.
  • the preselected range of the first roadside device may be the area around the first roadside device, and the preselected range of the first roadside device may be determined according to the factory index of the sensing range of the first roadside device and the installation direction of the first roadside device. In an example, the preselected range of the first roadside device is greater than the range indicated by the factory index of the sensing range of the first roadside device in the installation direction.
  • determining the multiple grids based on the preselected range of the first roadside device may include: performing gridding processing on the preselected range of the first roadside device to obtain multiple grids.
  • determining the plurality of grids based on the preselected range of the first roadside device may include: taking the intersection of the preselected range of the first roadside device and the first road to obtain the area to be divided; Perform grid processing on the area to be divided to obtain multiple grids.
  • the first road may represent the road where the first roadside device is located or the road perceived by the first roadside device, and the association relationship between the first road and the first roadside device may be preset when the first roadside device is deployed.
  • FIG. 20A shows an exemplary schematic diagram of regions to be divided in the embodiment of the present application.
  • the area to be divided does not exceed the road edge line of the first road. In this way, the number of perceived traffic participants will not be reduced, and the subsequent grid division and fusion will be facilitated.
  • Fig. 20B shows an exemplary schematic diagram of the grid in the embodiment of the present application.
  • the region to be divided can be divided into multiple grids.
  • the area to be divided is evenly divided into multiple grids, which is convenient for statistical management.
• other ways can also be used to divide the area to be divided into multiple grids; for example, the area of the grids divided in regions closer to the first roadside device may be smaller than the area of the grids divided in regions farther away from the first roadside device.
  • the grid index of each grid can be determined.
  • the grid index of the grid may be determined according to the index information of the target position point of the grid.
  • the grid metrics include one or more of detection metrics, motion metrics, and tracking metrics, wherein the detection metrics include precision and/or recall, and the motion metrics include speed and/or acceleration, the tracking index includes the correct matching rate of position points and/or target tracking stability.
  • the first condition includes one or more of the following conditions: the difference between detection indicators is smaller than a first threshold; the difference between sports indicators is smaller than a second threshold; the difference between tracking indicators is smaller than a third threshold.
  • the first threshold, the second threshold and the third threshold can be set as required, for example, the first threshold can be 90%, the second threshold can be 1m/s, and the third threshold can be 95%. This embodiment of the present application does not limit the first threshold, the second threshold, and the third threshold.
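A minimal sketch of the grid-merging step, assuming the example thresholds above (90%, 1 m/s, 95%); the Grid structure, the adjacency test, and the averaging of indicators when two grids are merged are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Grid:
    cells: set        # set of (row, col) cells belonging to this grid
    detection: float  # detection index, e.g. precision or recall
    motion: float     # motion index, e.g. speed error in m/s
    tracking: float   # tracking index, e.g. correct matching rate

def meets_first_condition(a: Grid, b: Grid,
                          t1: float = 0.90, t2: float = 1.0, t3: float = 0.95) -> bool:
    """Two grids satisfy the first condition when their indicator differences
    are below the thresholds (example values from the text: 90%, 1 m/s, 95%)."""
    return (abs(a.detection - b.detection) < t1 and
            abs(a.motion - b.motion) < t2 and
            abs(a.tracking - b.tracking) < t3)

def adjacent(a: Grid, b: Grid) -> bool:
    """Grids are adjacent if any of their cells share an edge."""
    return any(abs(r1 - r2) + abs(c1 - c2) == 1
               for (r1, c1) in a.cells for (r2, c2) in b.cells)

def merge_grids(grids: list) -> list:
    """Greedily merge adjacent grids until no pair satisfies the first condition.
    Averaging the indicators of merged grids is an assumption for illustration."""
    grids = list(grids)
    merged = True
    while merged:
        merged = False
        for i in range(len(grids)):
            for j in range(i + 1, len(grids)):
                a, b = grids[i], grids[j]
                if adjacent(a, b) and meets_first_condition(a, b):
                    new = Grid(cells=a.cells | b.cells,
                               detection=(a.detection + b.detection) / 2,
                               motion=(a.motion + b.motion) / 2,
                               tracking=(a.tracking + b.tracking) / 2)
                    grids = [g for k, g in enumerate(grids) if k not in (i, j)] + [new]
                    merged = True
                    break
            if merged:
                break
    return grids
```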
  • FIG. 20C shows a graph of the merged grid in the embodiment of the present application. As shown in FIG. 20C , the divided grids are merged to obtain three regions, namely region 1 , region 2 and region 3 . Referring to Fig. 20C, in area 1, the proportion of target position points whose matching result is FN is relatively large, the proportion of target position points whose matching result is FP is very small, and the proportion of target position points whose matching result is TP is extremely small (even 0).
• it can be seen that the first roadside device fails to perceive the traffic participants in area 1, so the first roadside device does not have perception capability in area 1. In area 2, the proportion of target position points whose matching result is TP is small, and the proportions of position points whose matching results are FN and FP are relatively large; it can be seen that the first roadside device can perceive some traffic participants in area 2, so the first roadside device has perception capability in area 2, but the perception capability is relatively poor. In area 3, the proportion of target position points whose matching result is TP is relatively large, and the proportions of target position points whose matching results are FN and FP are very small; it can be seen that the first roadside device has perception capability in area 3, and the perception capability is strong.
• in the case that there is no grid satisfying the first condition, that is, when the grids cannot continue to be merged, for any grid, the grid is determined as a sensing area, and the perception capability level of the sensing area is determined based on the index range to which the grid index of the sensing area belongs; the perception capability information of the first roadside device is then determined according to the location information and the perception capability level of each sensing area.
• each index range corresponds to a perception capability level. Determining the perception capability level of the perception area includes: if the grid index of the perception area belongs to the first index range, determining that the perception capability level of the perception area is the first perception capability level, where the first index range is any one of the index ranges, and the first perception capability level is the perception capability level corresponding to the first index range.
  • the grid index of the perception area belonging to the first index range may include: the detection index is within the first range, and/or the motion index is within the second range, and/or the tracking index is within the third range.
  • the first range, the second range, and the third range can be set as required, which are not limited in this embodiment of the present application.
  • the perception level may include: blind spot, low perception, average perception, and strong perception.
  • the levels of awareness may include: low, intermediate, and high.
  • the perception ability level may include: first level, second level, third level, fourth level and so on. It can be understood that, the above is only an exemplary description of the perception capability level, and the embodiment of the present application does not limit the division method and the number of divisions of the perception capability level.
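The following sketch shows one possible mapping from grid indexes to the perception capability levels named above; the index ranges used are illustrative assumptions, since real ranges would be pre-configured, pre-defined, or obtained according to protocol regulations.

```python
def perception_level(detection: float, tracking: float) -> str:
    """Map a perception area's grid indexes to one of the levels named above.

    The index ranges below are illustrative assumptions only.
    """
    if detection >= 0.90 and tracking >= 0.95:
        return "strong perception"
    if detection >= 0.75:
        return "average perception"
    if detection >= 0.50:
        return "low perception"
    return "blind spot"

print(perception_level(0.93, 0.97))  # -> strong perception
print(perception_level(0.40, 0.60))  # -> blind spot
```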
  • the first perception capability information may be used to indicate the perception capability of the first roadside device.
  • the first perception capability information may indicate areas that the first roadside device can sense and areas that cannot be sensed.
  • the first roadside device can sense the area within 200 meters, but cannot sense the area beyond 200 meters.
  • the first perception capability information may be used to indicate the perception capabilities of the first area and the first roadside device in the first area.
  • the first area may represent any area.
  • the first area may be an area on the first road.
  • the first area may be a rectangle, a sector, or a polygon.
  • the embodiment of the present application does not limit the shape and area of the first region.
• the first roadside device perceives the area within 100 meters well, that is, the perception capability is strong perception; the effect of sensing the area from 100 meters to 150 meters is average, that is, the perception capability is medium perception; the effect of sensing the area from 150 meters to 200 meters is poor, that is, the perception capability is weak perception; the area beyond 200 meters cannot be perceived, that is, the perception capability is no perception.
  • the first perception capability information may be used to indicate the perception capabilities of the first scene, the first area, and the first roadside device in the first scene and in the first area.
  • the "scene" in the embodiment of the present invention is used to identify the environment in which the device with the perception function works, or to identify the environment in which the target perceived by the device with the perception function is located.
  • the first scene may represent any kind of scene.
  • the first scene includes, but is not limited to, daytime, nighttime, sunny days, cloudy days, sandstorms, rain and snow, foggy days, and other scenes that affect perception. It is understandable that the sensing range of the first roadside device is greater during the day than at night, and the sensing range on sunny days is greater than that on cloudy, windy, sandy, rainy, snowy, and foggy days.
  • the perception capability of the first roadside device may be described by scene, so that the accuracy of the perception capability of the first roadside device is higher.
• for example, in one scene, the perception capability of the first roadside device in area 2 shown in FIG. 20C is medium perception and its perception capability in area 3 shown in FIG. 20C is strong perception, while in another scene, the perception capability of the first roadside device in area 2 shown in FIG. 20C is weak perception and its perception capability in area 3 shown in FIG. 20C is medium perception.
• scene tags can be added to the aforementioned roadside perception data, vehicle perception data, vehicle location data, and terminal location data, so that the roadside perception results and multi-source fusion perception results in the first scene can be obtained. If no scene tag is added to the above-mentioned roadside perception data, vehicle perception data, vehicle location data, and terminal location data, then before obtaining the roadside perception results and multi-source fusion perception results in the first scene, third-party information (for example, combining time information and historical weather information) can be used to obtain the roadside perception data, vehicle perception data, vehicle location data, and terminal location data in the first scene.
• the second perception capability information of the second roadside device may represent the perception capability of the second roadside device; for the manner of obtaining the second perception capability information, refer to the manner of acquiring the first perception capability information of the first roadside device, which will not be repeated here.
  • the first perception capability information of the first roadside device may be associated with an identification of the road.
• the perception capability information of each roadside device on a road or a section of road can be retrieved, so as to determine the roadside perception effect of each area on that road or road section, which helps to improve safety.
  • the perception capability information of each roadside device can be integrated to form an overall perception coverage capability.
  • the method further includes: generating multiple pieces of perception capability information of multiple roadside devices; and generating perception blind zone information according to the multiple pieces of perception capability information.
  • the pieces of perception capability information are used to indicate the perception capabilities of the multiple roadside devices. Specifically, if the multiple roadside devices include the first roadside device, the multiple pieces of perception capability information include the first perception capability information. In addition, the plurality of roadside devices may further include one or more second roadside devices, and then the plurality of perception capability information includes one or more second perception capability information.
  • the perception blind area information is used to indicate an area not covered by one or more roadside devices among the plurality of roadside devices.
  • the area not covered by one or more roadside devices in the plurality of roadside devices includes: an absolute blind area and/or a relative blind area.
  • each roadside device in the plurality of roadside devices cannot reach the perception ability standard in the absolute blind zone, and some roadside devices in the plurality of roadside devices cannot reach the required sensing ability standard in the relative blind zone.
  • the perception capability standard may be set as required, and this application does not limit the perception capability standard.
• reaching the perception capability standard includes but is not limited to: conforming to a preset perception capability level (for example, the corresponding perception capability level is Level 1 or Level 2), or being within a preset index range (for example, the detection index falls within the preset index range, and/or the motion index falls within the preset index range, and/or the tracking index falls within the preset index range), etc.
• if the roadside equipment fails to meet the perception capability standard in an area, it indicates that the roadside equipment has a poor perception effect in this area, and the confidence of the information sensed in this area is low. Therefore, this area is the perception blind area of the roadside equipment.
  • FIG. 21 shows an exemplary schematic diagram of a perception blind zone according to an embodiment of the present application.
  • FIG. 21 shows the boundary line between the perception blind zone and the non-perception blind zone of the roadside device 1 , and the boundary line between the perception blind zone and the non-perception blind zone of the roadside device 2 .
  • the area within the dividing line is a non-perceptual blind area, and the area outside the dividing line is a perceptual blind area.
  • the intersection of the perception blind zone of roadside device 1 and the non-perception blind zone of roadside device 2, and the intersection of the non-perception blind zone of roadside device 1 and the perception blind zone of roadside device 2 are relative perception blind spots.
  • the intersection of the perception blind zone of the roadside device 1 and the perception blind zone of the roadside device 2 is an absolute perception blind zone.
  • when a communication connection is established between roadside device 1 and roadside device 2, the perception capability of an area is based on the better of the perception capabilities of roadside device 1 and roadside device 2. If even this better capability does not reach the perception capability standard, the area can be determined as an absolute perception blind spot. In this case, the relative perception blind zone may not be marked.
  • when no communication connection is established between roadside device 1 and roadside device 2, the area where the perception capability of roadside device 1 does not meet the perception capability standard but the perception capability of roadside device 2 meets it, and the area where the perception capability of roadside device 2 does not meet the standard but that of roadside device 1 meets it, are determined as relative perception blind areas; the area where neither meets the perception capability standard is determined as an absolute perception blind zone.
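  • As a purely illustrative, non-limiting sketch of the classification rule above, the following Python snippet decides whether an area is an absolute or a relative perception blind zone given, for each roadside device, whether its perception capability in that area reaches the perception capability standard. The function name and the boolean inputs are assumptions made here for illustration; the actual standard and its evaluation are set as required.

```python
from typing import Dict, Optional

def classify_blind_zone(meets_standard: Dict[str, bool]) -> Optional[str]:
    """Classify one area from per-device flags (device id -> reaches the perception standard).

    Returns "absolute" if no device reaches the standard, "relative" if only some
    devices reach it, and None if every device reaches it (not a blind zone).
    """
    if not meets_standard:
        return "absolute"            # no device covers the area at all
    reached = sum(meets_standard.values())
    if reached == 0:
        return "absolute"            # none of the devices reaches the standard
    if reached < len(meets_standard):
        return "relative"            # some devices fail, at least one succeeds
    return None                      # all devices reach the standard

# Hypothetical two-device example corresponding to Figure 21:
print(classify_blind_zone({"roadside_1": False, "roadside_2": True}))   # relative
print(classify_blind_zone({"roadside_1": False, "roadside_2": False}))  # absolute
```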
  • different identifiers may be added for the absolute perceptual blind zone and the relative perceptual blind zone. For example, a first identifier is added for an absolute perceptual blind zone, and a second identifier is added for a relative perceptual blind zone.
  • in this way, whether a perception blind spot is an absolute perception blind spot or a relative perception blind spot can be determined according to the identification.
  • the relative blind spot may also be associated with the identification of the roadside device, so as to clarify which roadside device a relative blind spot belongs to.
  • the perception capability information of a roadside device may be associated with the roadside devices with which that roadside device has established communication connections. In this way, the user can determine which roadside devices have established communication connections with one another, so as to determine which areas are absolute perception blind spots and which are relative perception blind spots.
  • the method further includes: generating warning prompt information according to the first perception capability information.
  • the early warning prompt information is used to prompt the driver to take over the vehicle in the second area, to perform fault detection on the first roadside equipment, to reduce the confidence of the information about the second area sensed by the first roadside equipment, or to avoid the second area when planning a route.
  • the first perception capability information indicates that the perception capability of the first roadside device in the second area is lower than the perception threshold.
  • the perception threshold can be set as required. In an example, falling below the perception threshold may include, but is not limited to, one or more of: not reaching a preset perception ability level (for example, not reaching the first-level or second-level perception ability level), the detection index not reaching a preset detection index threshold, the motion index not reaching a preset motion index threshold, and the tracking index not reaching a preset tracking index threshold.
  • the detection index threshold, motion index threshold, and tracking index threshold here can be set as required, which is not limited in this embodiment of the present application.
  • the perception threshold can be greater than (higher than) or equal to the perception capability standard.
  • if the sensing ability of the first roadside device in the second area is lower than the sensing threshold, it means that the sensing effect of the first roadside device in the second area is poor, and the first roadside device cannot accurately and comprehensively perceive the traffic participants in the second area. Therefore, the risk of autonomous driving of the vehicle in the second area is higher, and the driver can take over the vehicle in the second area.
  • fault detection can be performed on the first roadside equipment to check whether the poor perception effect of the first roadside equipment in the second area is caused by a failure of the first roadside equipment, especially when the second area is relatively close to the first roadside equipment.
  • the information about the second area perceived by the first roadside device includes one or more of: the location points of the traffic participants in the second area, and the time information, location information, motion parameters and attribute information of each location point.
  • the second area can be avoided during path planning, which can reduce the possibility of accidents after the vehicle enters the second area; in particular, if an automatically driven vehicle avoids the second area, the driver does not need to take over the vehicle, which can effectively improve the user experience.
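  • To make the relationship between the perception threshold and the early warning actions concrete, a minimal sketch follows. The numeric level convention (a higher level meaning better perception), the default threshold and the action strings are illustrative assumptions, not part of the embodiment itself.

```python
from typing import List

def generate_warning(perception_level: int, perception_threshold: int = 2) -> List[str]:
    """Suggest early warning actions when a roadside device's perception level
    in an area falls below the perception threshold."""
    if perception_level >= perception_threshold:
        return []  # perception in this area is good enough, no warning needed
    return [
        "prompt the driver to take over the vehicle in this area",
        "perform fault detection on the roadside device",
        "reduce the confidence of information sensed in this area",
        "avoid this area during path planning",
    ]
```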
  • the roadside device may report parameters about coverage to the data processing device.
  • the data processing device generates the coverage information of the roadside equipment according to the coverage parameters reported by one or more roadside equipment.
  • the parameters about the coverage reported by the roadside device may be pre-configured, predefined or pre-designed in the roadside device, or may be obtained through actual detection.
  • the parameters about the coverage range or the coverage information may include information indicating the source of the coverage capability (for example, one of pre-design, actual measurement, estimation, etc.).
  • the roadside device may contain one or more sensing devices, or may be connected with one or more sensing devices.
  • the perception capability of the roadside equipment can be realized through the perception equipment.
  • sensing devices may be combined, and one or more sensing devices may form a sensing device group.
  • cameras and lidar can be used as a fusion perception device group to perform fusion perception combining image and laser detection.
  • the sensing coverage area in the coverage information may correspond to a sensing device or a sensing device group.
  • each sensing coverage area in the multiple sensing coverage areas may correspond to a sensing device, or each sensing coverage area may correspond to a sensing device group, or, in the plurality of sensing coverage areas, part of the sensing coverage areas correspond to sensing devices and part of the sensing coverage areas correspond to sensing device groups.
  • the sensing coverage area corresponding to the sensing device group and the coverage capability in the sensing coverage area are determined according to the coverage capabilities of the sensing devices in the sensing device group.
  • the coverage capability of a sensing device group may be obtained by fusing the sensing capabilities of multiple sensing devices. Further, the fused coverage capabilities are divided into areas according to levels, so that the coverage area corresponding to the sensing device group can be obtained.
  • the sensing coverage area corresponding to a sensing device group can be called a multi-device sensing coverage area; the multi-device sensing coverage area and the coverage capability of the roadside device in the multi-device sensing coverage area are determined according to the coverage capabilities of the sensing devices in the sensing device group.
  • that the plurality of sensing devices are related to the roadside device means that each sensing device in the plurality of sensing devices is related to the roadside device; that a sensing device is related to the roadside device means that the sensing device sends the information it perceives to the roadside device. In terms of physical realization, this includes but is not limited to the sensing device being installed in the roadside device, or being installed outside the roadside device and connected to the roadside device in a wireless or wired manner.
  • in a case where the coverage area of the roadside device is a communication area, the coverage capability information used to indicate the coverage capability of the roadside device in the at least one coverage area is communication capability information.
  • FIG. 22 shows a flowchart of a method for generating communication capability information provided by an embodiment of the present application. As shown in Figure 22, the method may include:
  • Step S2201: Acquire first communication status indication information.
  • Step S2202: Determine a first distribution of the plurality of location points around the first roadside device according to the first communication status indication information.
  • Step S2203: Generate first communication capability information of the first roadside device according to the first distribution situation.
  • the first roadside device represents the roadside device whose communication capability is to be determined.
  • the first roadside device may be any roadside device.
  • the first communication capability information may represent communication capability information of the first roadside device.
  • the first communication capability information may be used to indicate the communication capability of the first roadside device, for example, an area where the first roadside device can communicate and an area where the first roadside device cannot communicate.
  • the first communication state indication information may be used to instruct multiple terminal devices to establish communication connections with the first roadside device at multiple locations.
  • when a terminal device establishes a communication connection with the first roadside device at a location point, it indicates that the location point is within the communication range of the first roadside device, and that the communication capability of the first roadside device can reach that location point. Therefore, based on the distribution of the multiple location points at which the multiple terminal devices establish communication connections with the first roadside device, the area that the communication capability of the first roadside device can reach can be determined, thereby obtaining the communication range of the first roadside device conveniently and accurately.
  • the multiple location points of the multiple terminal devices indicated by the first communication state indication information may include: the location points of different terminal devices at the same time, the location points of the same terminal device at different times, and the location points of different terminal devices at different times.
  • for example, the multiple location points of multiple terminal devices may include: location point 1 of vehicle 1 at 1:00 am on Monday and location point 2 of vehicle 2 at 1:00 am on Monday, location point 3 of vehicle 1 at 1:00 pm on Monday, and location point 4 of vehicle 3 at 1:00 am on Tuesday and location point 5 of vehicle 4 at 1:00 pm on Tuesday. That is to say, the embodiment of the present application does not limit whether the multiple location points indicated by the first communication state indication information are location points of the same terminal device or location points collected at the same time.
  • the first communication status indication information may include: position information of the multiple indicated location points, working status information of the communication modules in the multiple indicated terminal devices, connection status information between the multiple indicated terminal devices and the first roadside device, identification information of the first roadside device, and time information.
  • the location information of the location point, the working status information and the time information of the communication module are as described above, and will not be repeated here.
  • the connection state information between a terminal device and a roadside device may be in a connected state or an unconnected state.
  • the connected state indicates that the terminal device has established a communication connection with the roadside device
  • the unconnected state indicates that the terminal device has not established a communication connection with the roadside device. Since the first communication status indication information indicates that a plurality of terminal devices establish communication connections with the first roadside device at multiple location points, the connection status information with the first roadside device in the first communication status indication information is the connected state.
  • the identification information of the roadside equipment can be used to identify different roadside equipment.
  • the identification information of the roadside equipment may be the name, number, location information of the roadside equipment, the identification of the communication module configured on the roadside equipment, or other user-defined identifications.
  • the identification information of the first roadside device may be the name and serial number of the first roadside device, the RSU_ID of the first roadside device, or other user-defined identifiers for the first roadside device.
  • FIG. 23 shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • the communication system includes a first roadside device 11 and a first terminal device 12 .
  • the first roadside device 11 may represent any roadside device
  • the first terminal device 12 represents a terminal device establishing a communication connection with the first roadside device 11 .
  • the first terminal device 12 includes, but is not limited to, a vehicle end device, a mobile terminal and other devices.
  • the first roadside device 11 may be connected to one or more first terminal devices 12 .
  • the first roadside device 11 may establish a communication connection with the first terminal device 12 through a communication module in the first terminal device 12 . After the first terminal device 12 acquires its own traffic participant data, it may report the acquired traffic participant data to the first roadside device 11 .
  • the traffic participant data of a terminal device may include the location information of the location point of the terminal device when the traffic participant data is collected, and the time information of the traffic participant data collection.
  • the traffic participant data may further include the working status information of the communication module and the identification information of the roadside equipment connected to the terminal equipment.
  • the location information can be recorded as Position
  • the working status information can be recorded as Connection
  • the identification information of the roadside equipment can be recorded as RSU_ID
  • the time information can be recorded as Time
  • the traffic participant data of a terminal device can be recorded as (Position, Device, Connection, RSU_ID, Time).
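  • A minimal data-structure sketch of the traffic participant record described above is given below. The field types are assumptions; the embodiment only names the fields, and the meaning of Device is taken here, hypothetically, to be an identifier of the reporting terminal device.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrafficParticipantData:
    position: Tuple[float, float]  # Position: location point where the data was collected
    device: str                    # Device: assumed here to identify the reporting terminal
    connection: str                # Connection: working status of the communication module
    rsu_ids: List[str]             # RSU_ID: identification of connected roadside equipment
    time: float                    # Time: collection timestamp
```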
  • since the first terminal device 12 is a terminal device that establishes a communication connection with the first roadside device 11, in the traffic participant data of the first terminal device 12, the working state information of the communication module is the "normal working state", and the identification information of the roadside device includes the "identification information of the first roadside device 11".
  • the first roadside device 11 may generate first communication status indication information based on the received information.
  • only the first terminal device 12 that establishes a communication connection with the first roadside device 11 can directly report its traffic participant data to the first roadside device 11; other terminal devices that have not established a communication connection with the first roadside device 11 cannot directly report their traffic participant data to the first roadside device 11 (this does not consider forwarding through other roadside devices; even if the first roadside device receives traffic participant data forwarded by other roadside devices, the traffic participant data of terminal devices that have established a communication connection with the first roadside device 11 can still be screened out based on the identification information of the roadside device therein). Therefore, the traffic participant data collected by the first roadside device 11 all come from the first terminal devices 12 that establish communication connections with the first roadside device 11.
  • the first roadside device may execute step S2202 to obtain the first distribution situation.
  • the first roadside device may determine the first distribution situation according to the location information of the first location point (that is, the location point of the first terminal device indicated by the first communication state indication information).
  • Fig. 24 shows an exemplary schematic diagram of the first distribution situation.
  • Figure 24 shows the location points of the first terminal devices (that is, the terminal devices that have established communication connections with the first roadside device); the location information of these location points constitutes the first distribution.
  • the first roadside device 11 may execute step S2203 to obtain the first communication capability information.
  • FIG. 25 shows a schematic structural diagram of a communication system provided by an embodiment of the present application.
  • the communication system includes: a first roadside device 11 , a second terminal device 13 and a server 14 .
  • the first roadside device 11 may be any roadside device.
  • the second terminal 13 may represent a terminal that establishes a communication connection with the server 14 . Both the first roadside device 11 and the second terminal device 13 can establish a communication connection with the server 14 through the cellular network. After the second terminal device 13 acquires its own traffic participant data, it may report the acquired traffic participant data to the server 14 .
  • the second terminal devices 13 may include the first terminal device 12 that establishes a communication connection with the first roadside device 11; that is, some second terminal devices 13 may not only establish communication connections with the server 14 but also establish communication connections with the first roadside device 11. Therefore, after the server 14 receives the traffic participant data reported by each second terminal device 13, it can filter out, based on the working status information and the roadside device identification information in each piece of traffic participant data, the traffic participant data of the terminal devices that have established communication connections with the first roadside device. Specifically, the server 14 can filter out, from the received traffic participant data, the traffic participant data in which the working status information of the communication module is the "normal working status" and the identification information of the roadside equipment includes the "identification information of the first roadside device 11", and generate the first communication state indication information based on the filtered traffic participant data.
  • the server 14 may execute step S2202 to obtain the first distribution situation, or may send the first communication status indication information to the first roadside device 11, and the first roadside device 11 executes step S2202 to obtain the first distribution situation.
  • in the process of screening the traffic participant data, the server 14 may first find, from the collected traffic participant data, the traffic participant data within the preselected range of the first roadside device, and then filter out, from the traffic participant data within the preselected range, the traffic participant data whose working status information is the "normal working status".
  • the data set composed of the traffic participant data screened out at this time is called data set A.
  • the server 14 can filter out, from data set A, the traffic participant data whose roadside equipment identification information includes the "identification information of the first roadside device 11"; the data set composed of the filtered data is called data set B.
  • the data set composed of traffic participant data in data set A except the traffic participant data in data set B is called data set C.
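  • The screening into data sets A, B and C might look like the following sketch; it reuses the hypothetical record structure above, and the helper in_preselected_range (a callable deciding whether a position lies in the preselected range) is likewise an assumption made for illustration.

```python
def split_datasets(records, first_rsu_id, in_preselected_range):
    """Split traffic participant records into data sets A, B and C for one roadside device.

    A: records inside the preselected range whose communication module works normally.
    B: records in A whose RSU_ID list contains the first roadside device.
    C: records in A but not in B.
    """
    data_a = [r for r in records
              if in_preselected_range(r.position) and r.connection == "normal working state"]
    data_b = [r for r in data_a if first_rsu_id in r.rsu_ids]
    data_c = [r for r in data_a if first_rsu_id not in r.rsu_ids]
    return data_a, data_b, data_c
```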
  • the preselected range is the area around the first roadside device 11 and can be determined according to the factory-specified communication range of the first roadside device 11 and the installation direction of the first roadside device; for example, it can be determined by reserving a certain margin in the installation direction around the location of the first roadside device 11 (for example, expanding by 3 meters, 5 meters, etc.).
  • FIG. 26 shows a schematic diagram of the distribution of terminal devices. As shown in Figure 26, a plurality of location points of terminal equipment are shown within the preselected range. At some location points the terminal equipment can establish a communication connection with the first roadside device, and at other location points it cannot. The traffic participant data corresponding to the location points where a communication connection with the first roadside device can be established are in data set B, and the traffic participant data corresponding to the location points where no communication connection with the first roadside device has been established are in data set C. The location information of the location points in data set B shown in FIG. 26 is the first distribution situation.
  • the server 14 or the first roadside device 11 may obtain the first communication capability information in step S2203.
  • step S2203 can be executed by the first roadside device or by the server.
  • the following description takes the first roadside device executing step S2203 as an example; for the process of the server executing step S2203, reference may be made to the process of the first roadside device executing step S2203, which will not be repeated in this embodiment of the present application.
  • step S2203 may include: the first roadside device directly generates the first communication capability information according to the first distribution situation.
  • the first distribution condition may be the density of the first location point, where the first location point represents the location point of the first terminal device. In the area where the density of the first location point is high, the communication capability of the first roadside device is strong, and in the area where the density of the first location point is small, the communication capability of the first roadside device is weak.
  • the first roadside device may generate the first communication capability information according to the density of the first location points.
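  • One possible way of turning the density of first location points into coarse communication capability information is sketched below; the area-size handling, the cut-off values and the label strings are purely illustrative assumptions.

```python
def point_density(points, area_m2: float) -> float:
    """Density of first location points in an area, in points per square metre."""
    return len(points) / area_m2

def capability_from_density(density: float, strong: float = 0.5, weak: float = 0.1) -> str:
    """Map a density value to a coarse communication capability label (illustrative cut-offs)."""
    if density >= strong:
        return "strong communication capability"
    if density >= weak:
        return "weak communication capability"
    return "communication blind zone"
```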
  • the first roadside device may obtain the second communication state indication information, determine the second distribution according to the second state indication information, and then in step S2203 according to the first distribution and the second distribution case, generate first communication capability information.
  • the second communication state indication information is used to indicate that at least one terminal device (for convenience of description, in this embodiment, at least one terminal device is referred to as at least one third terminal device) is at least one location point (for convenience of description) , in this embodiment of the application, at least one location point is referred to as at least one third location point) establishes a communication connection with the second roadside device, and the distance between at least one third location point and the first roadside device is less than the preset distance threshold.
  • for the acquisition process of the second communication state indication information, reference may be made to the acquisition process of the first communication state indication information, with the first roadside device in that process replaced by the second roadside device, and with the location information in the traffic participant data limited to points whose distance from the first roadside device is less than the preset threshold.
  • the preset threshold can be set according to needs. In terms of distance, the preset threshold can be 100 meters, 200 meters, 500 meters or 1000 meters.
  • the first roadside device may determine the second distribution situation according to the location information of the third location point (that is, the location point indicated by the second communication status indication information).
  • the second distribution may refer to the position information of the position points in the data set B in FIG. 26 plus the position information of the position points in the data set C.
  • if a terminal device has established a communication connection with the second roadside device at a point whose distance from the first roadside device is less than the preset threshold, this indicates that the working status information of the communication module of the terminal device is the "normal working status", and that the terminal device is located around the first roadside device.
  • the terminal device is the above-mentioned third terminal device
  • the location point is the above-mentioned third location point.
  • a third terminal device may have established a communication connection with the first roadside device at a third location point (for example, the location point in the data set B shown in Figure 26), or may not have established a communication connection with the first roadside device communication connection (for example, the location points in dataset C shown in FIG. 26).
  • the second distribution situation can be used as the comparison object of the first distribution situation: the second distribution situation reflects the location points that actually exist around the first roadside device and at which a communication connection with the first roadside device could be established, while the first distribution situation reflects the location points at which the first roadside device actually established communication connections.
  • the stable connection rate may be determined based on the first distribution situation and the second distribution situation. Wherein, the stable connection rate may be a ratio of the number of first location points to the number of third location points.
  • the first roadside device can generate the first communication capability information according to the stable connection rate.
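  • The stable connection rate itself is a simple ratio of the two point sets; a sketch with hypothetical names follows, usable both with the second distribution (third location points) and with the third distribution described below (second location points) as the reference.

```python
def stable_connection_rate(first_points, reference_points) -> float:
    """Ratio of the number of location points where the first roadside device actually
    established a connection (first distribution) to the number of reference points
    where a connection could have been established (second or third distribution)."""
    if not reference_points:
        return 0.0
    return len(first_points) / len(reference_points)
```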
  • the first roadside device may obtain the third communication state indication information, and then determine the third distribution situation according to the third communication state indication information, and then in step S2203, according to the first distribution situation and the second In the case of three distributions, the first communication capability information is generated.
  • the third communication status indication information is used to indicate that at least one terminal device (for convenience of description, in this embodiment, at least one terminal device is referred to as at least one second terminal device) is at least one location point (for convenience of description) , in this embodiment of the application, at least one location point is referred to as at least one second location point) to establish a communication connection with the server, and at least one second terminal device has the ability to connect to the first roadside device, and at least one second location point The distance between the point and the first roadside device is less than a preset threshold.
  • the third communication status indication information may be obtained by the server by screening the received traffic participant information, as shown in FIG. 25.
  • the server may filter, from the received traffic participant information, the traffic participant data whose distance from the first roadside device is less than the preset threshold and whose communication module working status information is the "normal working status", and then obtain the third communication state indication information based on the filtered traffic participant data.
  • the first roadside device may determine the third distribution situation according to the location information of the second location point (that is, the location point indicated by the third communication status indication information).
  • the third distribution may refer to the position information of the position points in the data set B in FIG. 26 plus the position information of the position points in the data set C.
  • a terminal device has established a communication connection with the server at a point whose distance from the first roadside device is less than a preset threshold, and the working status information of the communication module of the terminal device is "normal working status", indicating that the terminal device It is near the first roadside device and has the ability to connect to the first roadside device.
  • the terminal device is the above-mentioned second terminal device, and the location point is the above-mentioned second location point.
  • the third distribution situation can be used as the comparison object of the first distribution situation: the third distribution situation reflects the location points that actually exist around the first roadside device and at which a communication connection with the first roadside device could be established, while the first distribution situation reflects the location points at which the first roadside device actually established communication connections.
  • the stable connection rate may be determined based on the first distribution situation and the third distribution situation.
  • the stable connection rate may be a ratio of the number of first location points to the number of second location points. It can be understood that when the stable connection rate is large, the number of location points at which the first roadside device actually established a communication connection is relatively close to the number of location points that actually exist around the first roadside device and at which a communication connection could be established, and the communication capability of the first roadside device is relatively good.
  • the first roadside device can generate the first communication capability information according to the stable connection rate.
  • step S2203 may include: determining a plurality of grids based on the preselected range of the first roadside device; merging the grids whose grid indicators satisfy the first condition among the plurality of grids, Obtain the merged grid, and continue to merge the grids whose grid index meets the first condition in the existing grids, until there is no grid that meets the first condition; for any grid, determine the grid as A communication area, and based on the index range to which the grid index belongs, determine the communication capability level of the grid; according to the location information and communication capability level of each grid, determine the first communication capability information.
  • determining the multiple grids based on the preselected range of the first roadside device may include: performing gridding processing on the preselected range of the first roadside device to obtain multiple grids.
  • determining a plurality of grids based on the preselected range of the first roadside device may include: taking the intersection of the preselected range of the first roadside device and the first road to obtain the area to be divided; Grid processing to obtain multiple grids.
  • the first road may represent the road where the first roadside device is located or the roads around the first roadside device, and the relationship between the first road and the first roadside device may be preset when the first roadside device is deployed.
  • the grid index is: the density or the stable connection rate of the first position point in the grid
  • the corresponding first condition is: the density difference is smaller than the first threshold or the stable connection rate difference is smaller than the second threshold.
  • the first threshold and the second threshold can be set as required; for example, the first threshold can be 0.2 points per square meter, and the second threshold can be 0.1. The embodiment of the present application does not limit the first threshold and the second threshold.
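  • The iterative merging of grids whose grid indices satisfy the first condition could be sketched as follows. The grid representation (a cell is simply a list of first location points), the index function and the handling of adjacency are assumptions made to keep the example short; a real implementation would typically only merge neighbouring cells.

```python
def merge_grids(grids, grid_index, threshold):
    """Repeatedly merge pairs of cells whose grid-index difference is below `threshold`.

    `grids` is a list of cells, `grid_index(cell)` returns the cell's index
    (e.g. the density of first location points or the stable connection rate).
    Merging stops when no pair satisfies the first condition any more.
    """
    merged = True
    while merged:
        merged = False
        for i in range(len(grids)):
            for j in range(i + 1, len(grids)):
                if abs(grid_index(grids[i]) - grid_index(grids[j])) < threshold:
                    grids[i] = grids[i] + grids[j]  # merge cell j into cell i
                    del grids[j]
                    merged = True
                    break
            if merged:
                break
    return grids
```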
  • Fig. 27 shows an exemplary schematic diagram of the grid in the embodiment of the present application.
  • the preselected range of the first roadside device is divided into multiple grids.
  • the region to be divided is evenly divided into multiple grids (as shown in FIG. 27 ), which is convenient for statistical management.
  • other ways can also be used to divide the area to be divided into multiple grids; for example, the grids divided in the area closer to the first roadside equipment may have a smaller area than the grids divided in the area farther away from the first roadside equipment (not shown). In this way, the number of calculations and the number of merges can be reduced.
  • the density of the first position point of each grid can be determined as the grid index of each grid.
  • the grids whose grid indices meet the first condition among the multiple grids can be merged to obtain the merged grid.
  • Fig. 28 shows an exemplary schematic diagram of the grid merging result in the embodiment of the present application. As shown in Figure 28, the grids shown in Figure 27 are finally merged to obtain Area 1 and Area 2, where the density of the first location points in Area 1 is small and the density of the first location points in Area 2 is relatively large. It can be seen that the first roadside device has communication capability in Area 1, but the communication capability there is weak, while in Area 2 it has communication capability and the communication capability is strong.
  • Fig. 29 shows an exemplary schematic diagram of a grid in an embodiment of the present application.
  • the preselected range of the first roadside device is divided into multiple grids.
  • the stable connection rate of each grid can be determined as the grid index of each grid.
  • the grids whose grid indices meet the first condition among the multiple grids can be merged to obtain the merged grid.
  • determine the grid index of each grid obtained after the last round of merging and continue to merge the existing grids whose grid index satisfies the first condition until there is no grid that satisfies the first condition .
  • FIG. 30 shows an exemplary schematic diagram of grid merging results in the embodiment of the present application.
  • the grids shown in Figure 29 are finally merged to obtain Region 1 and Region 2.
  • the stable connection rate in area 1 is small, and the stable connection rate in area 2 is relatively large.
  • the first roadside device has communication capability in Area 1, but the communication capability there is weak; it has communication capability in Area 2, and the communication capability there is strong.
  • for any grid obtained after merging, the grid is determined as a communication area, and the communication capability level of the communication area is determined based on the index range to which the grid index of the communication area belongs; according to the location information and communication capability level of each communication area, the communication capability information of the first roadside device can be determined.
  • each index range corresponds to a communication capability level
  • determining the communication capability level of the communication area based on the index range to which the grid index of the communication area belongs includes: in a case where the grid index of the communication area belongs to a first index range, determining that the communication capability level of the communication area is a first communication capability level.
  • the first index range is any one of the index ranges
  • the first communication capability level is the communication capability level corresponding to the first index range.
  • for example, there are two communication areas, Area 1 and Area 2, where the grid index of Area 1 belongs to Index Range 1 and the grid index of Area 2 belongs to Index Range 2; then the communication capability level of the first roadside equipment in Area 1 is Level 1, and the communication capability level in Area 2 is Level 2.
  • the grid index of the communication area belonging to the first index range may include: the density is within the first range, and/or the stable connection rate is within the second range.
  • the first range and the second range can be set as required, and are not limited in this embodiment of the present application.
  • the communication capability level may include: communication blind zone, weak communication capability, general communication capability and strong communication capability.
  • the communication capability levels may include: low level, medium level and high level.
  • the communication capability level may include: first level, second level, third level, fourth level and so on. It can be understood that, the above is only an exemplary description of communication capability levels, and this embodiment of the present application does not limit the division manner and number of communication capability levels.
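  • A sketch of mapping a grid index to a communication capability level via index ranges is given below; the concrete ranges (expressed here on a 0-to-1 stable-connection-rate scale) and the level names are hypothetical examples of the many possible divisions mentioned above.

```python
# Each entry: (lower bound inclusive, upper bound exclusive, communication capability level)
INDEX_RANGES = [
    (0.0, 0.3, "communication blind zone"),
    (0.3, 0.6, "weak communication capability"),
    (0.6, 0.8, "general communication capability"),
    (0.8, 1.01, "strong communication capability"),
]

def capability_level(grid_index: float) -> str:
    """Return the communication capability level whose index range contains grid_index."""
    for low, high, level in INDEX_RANGES:
        if low <= grid_index < high:
            return level
    raise ValueError("grid index outside all configured index ranges")
```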
  • the first communication capability information may be used to indicate the communication capability of the first roadside device.
  • the first communication capability information may indicate areas where the first roadside device can communicate and areas where it cannot communicate.
  • the first roadside device can communicate with terminal devices in an area within 200 meters, but cannot communicate with terminal devices in an area beyond 200 meters.
  • the first communication capability information may be used to indicate the communication capabilities of the first area and the first roadside device in the first area.
  • the first area may represent any area.
  • the first area may be a first area on the first road.
  • the first area can be rectangular, fan-shaped, elliptical or other shapes.
  • the embodiment of the present application does not limit the shape and area of the first region.
  • for example, the communication effect of the first roadside device is good in the area within 100 meters, that is, the communication capability is strong; the communication effect between 100 meters and 150 meters is average, that is, the communication capability is medium; the communication effect in the area from 150 meters to 200 meters is poor, that is, the communication capability is weak; and it cannot communicate with the area beyond 200 meters, that is, it has no communication capability there.
  • the first communication capability information may be used to indicate the communication capabilities of the first scenario, the first area, and the first roadside device in the first area in the first scenario.
  • the "scene" in the embodiment of the present application is used to identify the environment where the device with communication function is located (for example, the environment where the first roadside device is located), or to identify the environment where the communication object of the device with communication function is located ( For example, the environment of vehicles or pedestrians).
  • the first scene may represent any kind of scene.
  • the first scene includes, but is not limited to, daytime, nighttime, sunny days, cloudy days, sandstorms, rain and snow, foggy days, and other scenes that affect perception. It can be understood that the communication range of the first roadside device on sunny days is greater than that on cloudy, sandy, rainy and foggy days.
  • in scenarios with different traffic flows, the communication range of the first roadside device is also different.
  • the communication range may be smaller during the day when the traffic flow is heavy, and larger at night when the traffic flow is light. Therefore, in the embodiment of the present application, the communication capability of the first roadside device may be described by scenario, so that the description of the communication capability of the first roadside device is more accurate. For example, in a sunny scenario, the communication capability of the first roadside device in Area 1 shown in Figure 30 is medium communication and the communication capability in Area 2 shown in Figure 30 is strong communication; in another scenario, the communication capability of the first roadside device in Area 1 shown in FIG. 30 is weak communication, and the communication capability in Area 2 shown in FIG. 30 is medium communication.
  • in a case where the first communication capability information indicates the first scene, the first area, and the communication capability of the first roadside device in the first area in the first scene, scene tags can be added to the aforementioned traffic participant data, so that the first communication state indication information, the second communication state indication information and the third communication state indication information in the first scenario can be acquired. If no scene label is added to the above traffic participant data, the traffic participant data in the first scene can be obtained by combining third-party information (for example, combining time information and historical weather information).
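  • If no scene label is stored with the traffic participant data, a scene can be attached afterwards by combining the time information with third-party data, as noted above. The sketch below assumes a hypothetical weather_lookup callable that returns, for a timestamp, a label such as "sunny" or "foggy" from historical weather information; the day/night split at 6:00 and 18:00 is likewise only an assumption.

```python
from datetime import datetime

def tag_scene(record, weather_lookup) -> str:
    """Derive a scene label for one traffic participant record from time and weather."""
    ts = datetime.fromtimestamp(record.time)
    day_or_night = "daytime" if 6 <= ts.hour < 18 else "nighttime"
    return f"{day_or_night}, {weather_lookup(record.time)}"

def filter_by_scene(records, scene, weather_lookup):
    """Keep only the records whose derived scene matches the requested first scene."""
    return [r for r in records if tag_scene(r, weather_lookup) == scene]
```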
  • so far, the first communication capability information of the first roadside device has been obtained.
  • the manner of obtaining the communication capability information of other roadside devices refer to the manner of obtaining the first communication capability information of the first roadside device, which will not be repeated here.
  • the manner of obtaining the second communication capability information of the second roadside device may refer to the manner of obtaining the first communication capability information of the first roadside device.
  • the first communication capability information of the first roadside device may be associated with the identification of the road.
  • the communication capability information of each roadside device on a road or a section of road can be called out, so as to determine the roadside communication capabilities of each area on a road or a section of road , which is conducive to improving security.
  • the first communication capability information may be stored as map data.
  • when the vehicle is driving intelligently, it can obtain the first communication capability information from the map, so as to determine whether the driver needs to take over the vehicle when the vehicle travels to a certain area, whether it is necessary to reduce the confidence level of the information from the first roadside device in a certain area, or whether a certain area needs to be avoided when planning a route, thereby improving safety.
  • the first communication capability information may be associated with the first roadside device and stored as map data.
  • the communication capability information of other roadside devices (for example, the second communication capability information of the second roadside device) may also be stored as map data to improve safety.
  • the communication capability information of multiple roadside devices can be integrated to form an overall communication coverage capability.
  • the method further includes: generating multiple pieces of communication capability information of multiple roadside devices; and generating communication blind zone information according to the multiple pieces of communication capability information.
  • the pieces of communication capability information are used to indicate the communication capabilities of multiple roadside devices.
  • the plurality of roadside devices include the first roadside device, and the plurality of communication capability information includes the first communication capability information.
  • the plurality of roadside devices may further include one or more second roadside devices, and then the plurality of pieces of communication capability information may include one or more pieces of second communication capability information.
  • the communication blind area information is used to indicate an area not covered by one or more roadside devices among the plurality of roadside devices.
  • the area not covered by one or more roadside devices in the plurality of roadside devices includes: an absolute blind area and/or a relative blind area, wherein none of the plurality of roadside devices can reach the threshold T1 in the absolute blind zone, and some of the plurality of roadside devices cannot reach the threshold T2 in the relative blind zone.
  • the threshold T1 and the threshold T2 may be set as required, and the embodiment of the present application does not limit the threshold T1 and the threshold T2.
  • Threshold T1 and threshold T2 may be used to indicate expected or acceptable communication effects. When a roadside device fails to reach the threshold T1 or threshold T2, it indicates that the communication effect of the roadside device does not meet expectations or is unacceptable. When a roadside device can reach the threshold T1 or threshold T2, it indicates that the communication effect of the roadside device can meet expectations or is acceptable.
  • reaching the threshold T1 or the threshold T2 includes, but is not limited to: conforming to a preset communication capability level (for example, the corresponding communication capability level is Level 1 or Level 2), or being within a preset index range (for example, the density falls within a preset target range, and/or the stable connection rate falls within a preset target range), etc.
  • if a roadside device does not reach the threshold T1 in an area, it indicates that the communication effect of the roadside device in this area is poor, and the reliability and accuracy of the information obtained by the roadside device through communication in this area are low (low confidence, incomplete); therefore, this area is a blind area of the roadside equipment.
  • the threshold T1 and the threshold T2 may be the same or different, which is not limited.
  • FIG. 31 shows an exemplary schematic diagram of a communication dead zone according to an embodiment of the present application.
  • FIG. 31 shows the boundary line between the communication blind area and the non-communication blind area of the roadside device 1 , and the boundary line between the communication blind area and the non-communication blind area of the roadside device 2 .
  • the area within the dividing line is a non-communication blind area, and the area outside the dividing line is a communication blind area.
  • the intersection of the communication blind area of roadside equipment 1 and the non-communication blind area of roadside equipment 2, and the intersection of the non-communication blind area of roadside equipment 1 and the communication blind area of roadside equipment 2 are relative communication blind areas.
  • the intersection of the communication blind area of roadside equipment 1 and the communication blind area of roadside equipment 2 is an absolute communication blind area.
  • when a communication connection is established between roadside equipment 1 and roadside equipment 2, the communication capability of an area is based on the better of the communication capabilities of roadside equipment 1 and roadside equipment 2; if neither the communication capability of roadside equipment 1 nor that of roadside equipment 2 reaches the threshold T1 in the area, it can be determined that the area is an absolute communication dead zone. In this case, the relative communication dead zone may not be marked.
  • when no communication connection is established between roadside equipment 1 and roadside equipment 2, the area where the communication capability of roadside equipment 1 does not reach the threshold T1 but the communication capability of roadside equipment 2 reaches the threshold T1, and the area where the communication capability of roadside equipment 2 does not reach the threshold T1 but the communication capability of roadside equipment 1 reaches the threshold T1, are determined as relative communication blind areas; the area where the communication capabilities of both do not reach the threshold T1 is determined as an absolute communication dead zone.
  • different identifiers may be added for the absolute communication blind zone and the relative communication blind zone. For example, a first identifier is added for an absolute communication blind zone, and a second identifier is added for a relative communication blind zone. In this way, it can be determined whether a communication blind area is an absolute communication blind area or a relative communication blind area according to the identification.
  • the relative communication blind spot may also be associated with the identification of the roadside equipment, so as to specify which roadside device a relative communication blind spot belongs to.
  • the communication capability information of a roadside device may be associated with the roadside devices with which that roadside device has established communication connections. In this way, the user can determine which roadside devices have established communication connections with one another, so as to determine which areas are absolute communication blind spots and which are relative communication blind spots.
  • the method further includes: generating early warning prompt information according to the first communication capability information.
  • the early warning prompt information can be used to prompt the driver to take over the vehicle in the second area, to perform fault detection on the first roadside equipment, to update the software of the first roadside equipment, to adjust the deployment of the first roadside equipment, to reduce the confidence level of information from the first roadside equipment in the second area, or to avoid the second area when planning routes, wherein the first communication capability information indicates that the communication capability of the first roadside device in the second area is lower than the first threshold.
  • the first communication capability information indicates that the communication capability of the first roadside device in the second area is lower than the first threshold.
  • the first threshold can be set as required. In an example, being lower than the first threshold may include, but is not limited to, one or more of: failing to reach a preset communication capability level (for example, not reaching the first-level or second-level communication capability level), the density of the first location points not reaching a preset density threshold, and the stable connection rate not reaching a preset stability threshold.
  • the density threshold and the stability threshold can be set as required, and are not limited in this embodiment of the present application.
  • the first threshold is used for early warning, and early warning is required in non-communication dead spots but poor communication areas. Therefore, in an example, the first threshold can be greater than (higher than) or equal to threshold T1 and threshold T2.
  • if the communication capability of the first roadside device in the second area is lower than the first threshold, it means that the communication effect of the first roadside device in the second area is poor, and the first roadside device cannot accurately and comprehensively communicate with the terminal devices in the second area; therefore, there is no guarantee that the first roadside device can transfer the information it obtains (including the information it senses itself and the information it collects from other devices) to every terminal device in the second area. Consequently, when the vehicle is driving automatically in the second area, there may not be enough data sources, the risk is high, and the driver can take over the vehicle in the second area.
  • fault detection can be performed on the first roadside equipment to check whether the poor communication effect of the first roadside equipment in the second area is caused by a failure of the first roadside equipment, especially when the second area is relatively close to the first roadside equipment.
  • the communication effect of the first roadside device in the second area is poor, the information of the terminal devices in the second area collected by the first roadside device cannot better represent the actual situation in the second area, so , the confidence of the information obtained by the first roadside device needs to be reduced in the second area.
  • the second area can be avoided during path planning, which can reduce the possibility of accidents after the vehicle enters the second area; in particular, if an automatically driven vehicle avoids the second area, the driver does not need to take over the vehicle, which can effectively improve the user experience.
  • FIG. 8A is a schematic diagram of a possible scenario where this embodiment of the present application is applicable
  • FIG. 8B is a schematic diagram of a possible coverage area provided by this embodiment of the present application.
  • the sensing device 801 and the sensing device 802 belong to the same sensing device group and can sense road conditions.
  • the sensing coverage area of the sensing device 801 and the sensing coverage area of the sensing device 802 are shown in FIG. 8B .
  • Table 4 shows possible coverage information provided by the embodiment of the present application.
  • the coverage information shown in Table 4 is used to exemplarily describe the coverage areas shown in FIG. 8A and FIG. 8B .
  • the coverage capability corresponding to sensing device group 1 is obtained by fusing the coverage capabilities of sensing device 1 and sensing device 2; the coverage area of sensing device group 1 is obtained according to the fused coverage capability.
  • the sensing device group 1 includes multiple coverage areas, the multiple coverage areas are divided into levels according to the fused coverage capabilities.
  • the coverage area in the coverage information may be obtained by fusing the coverage areas of multiple devices.
  • the coverage capability information in the coverage information may also be obtained by fusing coverage capabilities of multiple devices.
  • the coverage area 6 of the sensing device group 1 as an example, the coverage area 6 may be obtained by fusing the coverage area 7 of the sensing device 1 and the coverage area 8 of the sensing device 2 .
  • the fusion can be understood as: the coverage area 6 is obtained by overlapping the coverage area 7 and the coverage area 8 .
  • fusion can also be performed through fitting, reinforcement learning models, deep learning models, or preset calculation methods, and this application is also applicable to the above-mentioned method of fusing perception areas.
  • the coverage capability information of the coverage area 6 of the sensing device group 1 may be determined according to the coverage capability information of the sensing device 1 and the coverage capability information of the sensing device 2 .
  • the coverage capability of the sensing device group 1 in the coverage area 6 is obtained by fusing the coverage capabilities of the sensing device 1 and the sensing device 2 .
  • the fusion of coverage capability information can also be carried out by fitting, reinforcement learning model, deep learning model, or preset calculation methods, and this application is also applicable to the above-mentioned multiple fusion methods.
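As a concrete illustration of overlap-based fusion, the sketch below intersects two polygonal coverage areas and combines their correct rates conservatively. The use of the shapely library and the min-rule for capabilities are assumptions for illustration; the embodiment equally allows fitting, learning models, or preset calculation methods.

```python
from shapely.geometry import Polygon

def fuse_coverage(area_7: Polygon, rate_7: float, area_8: Polygon, rate_8: float):
    """Fuse two sensing coverage areas into a single area and capability value."""
    fused_area = area_7.intersection(area_8)  # region covered by both sensing devices
    fused_rate = min(rate_7, rate_8)          # conservative fused correct rate
    return fused_area, fused_rate

# Example: two rectangular coverage areas with correct rates of 98% and 95%.
area_7 = Polygon([(0, 0), (40, 0), (40, 20), (0, 20)])
area_8 = Polygon([(20, 0), (60, 0), (60, 20), (20, 20)])
fused_area, fused_rate = fuse_coverage(area_7, 0.98, area_8, 0.95)
print(fused_area.bounds, fused_rate)  # overlap spans x in [20, 40]; rate 0.95
```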
  • When the coverage information includes multiple coverage areas, there may be overlapping areas among the multiple coverage areas. In this case, the coverage information further includes information about the multiple coverage capabilities corresponding to the multiple coverage areas. For example, referring to FIG. 8B, there may be an overlapping area between the coverage area 7 of the sensing device 801 and the coverage area 8 of the sensing device 802.
  • Table 5 shows another possible coverage information provided by the embodiment of the present application. The coverage information shown in Table 5 is used to exemplarily describe the coverage areas shown in FIG. 8A and FIG. 8B. It may include the coverage area 7 of the sensing device 801 and the coverage capability information of the corresponding coverage area 7 (for example, the corresponding coverage capability information may be that the correct rate of the sensing results is >98% and the recall rate is >94%), and may also include the coverage area 8 of the sensing device 802 and the coverage capability information of the corresponding coverage area 8 (for example, the corresponding coverage capability information may be that the correct rate of the sensing results is >95% and the recall rate is >90%).
  • Optionally, the coverage information may also include information about blind areas, where the blind areas may include at least one of communication blind areas, perception blind areas, and the like. Because the coverage areas in the coverage information may be divided according to coverage capabilities of different levels, blind areas may also correspond to different blind-area levels. For example, an area where the correct rate of perception results is lower than 40% is regarded as a first-level perception blind area, and an area where the correct rate of perception results is lower than 10% is regarded as a second-level perception blind area. The communication blind area and the perception blind area may be independent, or may be processed, for example, by taking their intersection, as sketched below.
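The following sketch applies the example blind-area thresholds quoted above (40% and 10% correct rate) and one possible way of combining perception and communication blind areas by intersection. The grid-cell representation is an assumption for illustration.

```python
def perception_blind_level(correct_rate: float) -> int:
    """0 = not a blind area, 1 = first-level blind area, 2 = second-level blind area."""
    if correct_rate < 0.10:
        return 2
    if correct_rate < 0.40:
        return 1
    return 0

def combined_blind_cells(perception_blind: set, communication_blind: set) -> set:
    """One option from the text: keep only cells that are blind for both perception
    and communication (areas represented as sets of grid-cell identifiers)."""
    return perception_blind & communication_blind
```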
  • FIG. 9 is a schematic diagram of a possible blind area provided by the embodiment of the present application, wherein Scope 1 (Scope1) is a coverage area of the sensing device 901, and Scope 2 (Scope2) is a coverage area corresponding to the communication capability of the communication device 902. Road section A is within both coverage areas, while road sections B and C belong to perception blind areas but do not completely belong to communication blind areas, so vehicles or other devices located on road sections B and C can still receive sensing results from the sensing device 901.
  • Optionally, the coverage capability information of the roadside device in the at least one coverage area indicates multiple capabilities under multiple environments. For example, under different weather conditions such as sunny, rainy, and foggy days, the coverage capability information may be different. For another example, at different times such as day and night, or under different temperature, humidity, and brightness conditions, the coverage capability information of the roadside equipment may also be different.
  • Accordingly, the coverage information may include information for indicating applicable scenarios. For example, the coverage information includes an applicable scene field, which is used to indicate the coverage capability of the sensing device 3 in different environments. When the coverage information is used later, the scene factors can be reasonably taken into account, which improves the accuracy of the coverage area and thus its reliability.
  • Optionally, the coverage information may also include one or more fields such as season, time period, weather, temperature, humidity, and brightness.
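An applicable-scene field can be realized, for example, as a lookup keyed by environment conditions. The keys, field names, and numbers below are purely illustrative.

```python
# Coverage capability records keyed by (weather, time period); illustrative values only.
capability_by_scene = {
    ("sunny", "day"):   {"correct_rate": 0.98, "recall_rate": 0.94},
    ("rainy", "day"):   {"correct_rate": 0.92, "recall_rate": 0.88},
    ("foggy", "night"): {"correct_rate": 0.80, "recall_rate": 0.75},
}

def capability_for_environment(weather: str, time_period: str):
    """Return the coverage capability matching the current environment, if recorded."""
    return capability_by_scene.get((weather, time_period))
```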
  • the coverage information may also include one or more items of roadside device identification, tile identification (ID), and the like.
  • the tile is a component in the tile map.
  • the coverage information may include tile IDs.
  • the coverage information can be associated with the tile through the tile ID, which is convenient for updating the map with the coverage information, and convenient for storing and managing the coverage information.
  • Step S402: The data processing device stores the coverage information as map data.
  • The data processing device can directly store the obtained coverage information, or can store the coverage information after processing it. The processed coverage information better conforms to the storage requirements of map data; its form may differ from that of the obtained coverage information, but the indicated content is consistent.
  • Storing the coverage information as map data means storing the coverage information, as information carried in the map, in the storage medium of the cloud, the roadside, or the terminal, in the compiled form or storage format used by other information in the map.
  • FIG. 11 is a schematic diagram of a possible data structure of coverage information as map data provided by an embodiment of the present application. The tile ID is used to identify a map tile, and the roadside ID is used to identify a roadside device.
  • The lower layer of each roadside ID includes the information of the communication coverage area corresponding to that roadside ID, which may specifically include a default use range level and at least one range level. The default use range level is used to indicate the coverage area corresponding to which level of coverage capability is displayed by default. The range levels are, for example, a primary range, a secondary range, a tertiary range, and so on.
  • The lower layer of a range level may include the coverage area, and optionally also the content indicated by the level (that is, the illustrated indicator). For example, the lower layer of a range level may contain the indicator (or content, index item) indicated by the level and the value (or value range) corresponding to the indicator, for example, "Indicator: correct rate, value range: ≥90%".
  • The data structure shown in FIG. 11 is only an example. In the case where the roadside device includes a plurality of sensing devices or communication devices, the lower layer of the roadside device ID may include the ID of the sensing device (or sensing device group), or the ID of the communication device (or communication device group).
  • FIG. 12 is a schematic structural diagram of another possible coverage information provided by the embodiment of the present application. The lower layer of each roadside ID includes a sensor (also called a sensing device) ID, a sensor group ID, or the like; that is, the lower layer of the roadside device ID may include the identification of a sensor group, the identification of a sensor, and so on.
  • The lower layer of the identification of the sensor group contains the sensor list of the sensor group, the working status of each sensor in the sensor list (for example, normal operation, fault, etc.), the default use range level, the working mode (one or more of a fusion mode, a single-sensor mode, and other modes), and the like. The default use range level is used to indicate the coverage area corresponding to which level of coverage capability is displayed by default. The range levels are, for example, a primary range, a secondary range, a tertiary range, and so on. The lower layer of a range level may include the coverage area, and optionally also the content indicated by the level (that is, the illustrated indicator).
  • In other words, the data structure of the coverage information can include the fused perception area and coverage capability, and can also include the perception area and coverage capability of a single lidar, as well as the perception area and coverage capability of a single vision sensor. A sketch of this hierarchical layout is given below.
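The hierarchy described for FIG. 11 and FIG. 12 could be laid out as nested records, for example as below. The field names and the polygon references are assumptions for illustration and do not reproduce the figures.

```python
# tile ID -> roadside ID -> sensor group ID -> range levels -> indicator + coverage area
coverage_map_data = {
    "tile_0001": {
        "roadside_A": {
            "sensor_group_1": {
                "sensor_list": ["lidar_1", "camera_1"],
                "working_status": {"lidar_1": "normal", "camera_1": "normal"},
                "working_mode": "fusion",            # or "single-sensor"
                "default_range_level": "primary",
                "range_levels": {
                    "primary":   {"indicator": "correct_rate", "value_range": ">=90%",
                                  "coverage_area": "polygon_ref_1"},
                    "secondary": {"indicator": "correct_rate", "value_range": ">=75%",
                                  "coverage_area": "polygon_ref_2"},
                },
            },
        },
    },
}

def default_coverage_area(tile_id: str, roadside_id: str, group_id: str):
    """Return the coverage area displayed by default for a sensor group."""
    group = coverage_map_data[tile_id][roadside_id][group_id]
    return group["range_levels"][group["default_range_level"]]["coverage_area"]
```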
  • In a possible implementation, the data processing device may generate a first layer according to the coverage capability, where the first layer belongs to the aforementioned map.
  • FIG. 10A is a schematic diagram of a possible map layer provided by an embodiment of the present application. The map shown in FIG. 10A may include a coverage layer, a road layer, a building layer, a congestion state layer, and other layers (example only).
  • Optionally, the coverage information can be displayed on a display interface. The map layer related to the coverage information can be displayed alone, or can be displayed on the map display interface together with other map layers in a superimposed manner. FIG. 10B is a schematic diagram of a possible map provided by the embodiment of the present application; the map shown in FIG. 10B can be obtained by superimposing the coverage layer, the congestion state layer, the road layer, and the building layer.
  • Optionally, the data processing device may also update the coverage layer in the map according to the coverage information, where updating the map includes one or more of adding a coverage area, removing a coverage area, modifying a coverage area, and modifying capability information. For example, the coverage area matching the current environment is selected for display as the environment changes, or the coverage area of a faulty roadside device is no longer displayed when that roadside device fails. A sketch of these update operations is given below.
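The update operations listed above (adding, removing, and modifying coverage areas or capability information) could be wrapped in a small layer object such as the sketch below; the class and field names are illustrative, not part of the embodiment.

```python
class CoverageLayer:
    """A minimal coverage layer keyed by coverage-area identifiers."""

    def __init__(self):
        self.areas = {}

    def add_area(self, area_id, geometry, capability):
        self.areas[area_id] = {"geometry": geometry, "capability": capability}

    def remove_area(self, area_id):
        # e.g. stop displaying the coverage area of a faulty roadside device
        self.areas.pop(area_id, None)

    def modify_area(self, area_id, geometry=None, capability=None):
        if geometry is not None:
            self.areas[area_id]["geometry"] = geometry
        if capability is not None:
            self.areas[area_id]["capability"] = capability
```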
  • the data processing device may receive update instruction information, so as to update the map.
  • the data processing device may determine the data structure corresponding to the map according to the coverage information.
  • the subsequent data processing device can update the map through the data structure corresponding to the map.
  • Optionally, the data structure in FIG. 11 or FIG. 12 may include blind areas. The blind areas may also be divided into different levels; for details, refer to the description above, which is not repeated here.
  • the data processing device is used as a map generating device, and can send the map to other devices (vehicles, roadside equipment, assisted driving servers, etc.) after generating or updating the map including the coverage information.
  • In a possible implementation, the data processing device can also use the coverage information to perform information processing. For example, the data processing device determines one or more of the safety level of the vehicle, the driving strategy of the vehicle, and the like according to the coverage capability.
  • In an example, the data processing device determines the safety level of the vehicle according to the coverage capability. The safety level of the vehicle can be used to determine the weight with which the automatic driving apparatus of the vehicle participates in the operation of the vehicle. Table 7 is a table of possible vehicle safety levels exemplified by the embodiment of the present application. It can be seen that when the vehicle is located in the area corresponding to the first-level range, the safety level is level 1; in this case, the data processing device can respond to the driving scene according to the sensing results or communication data of the roadside equipment, and the driver may not be required. For related descriptions, refer to the foregoing.
  • In another example, the data processing device determines the driving strategy of the vehicle according to the coverage capability. The driving strategy may include one or more of the safety level, the confidence of the perception results, whether the driver is required to take over the vehicle, whether to start automatic driving (or assisted driving), and so on.
  • The coverage capability corresponds to the first coverage area, and the vehicle can determine the coverage capability of the current roadside device according to whether the vehicle is within the first coverage area, and then adjust the driving strategy. For example, in response to the first vehicle being located in the first coverage area, the safety level of the first vehicle is determined to be a high safety level; or the confidence of the sensing results of the roadside device is increased; or a first reminder message is triggered, where the first reminder message is used to remind the user to turn on the automatic driving function or the assisted driving function of the first vehicle. Otherwise, for example when the first vehicle is not located in the first coverage area, a second reminder message may be triggered, where the second reminder message is used to remind the user to take over the first vehicle.
  • This application exemplifies a possible design. Referring to FIG. 7 and taking the data processing device included in the vehicle 702 as an example: when the vehicle 702 is located in the perception coverage area with a correct rate of 90%, the safety level of the vehicle can be raised; similarly, the confidence of the sensing results of the roadside device can be increased, or a first reminder message may be triggered to remind the user to turn on the automatic driving function or the assisted driving function of the first vehicle. When the vehicle 702 is no longer located in that perception coverage area, a second reminder message may be triggered to remind the user to take over the first vehicle. It can be seen that when the vehicle is located in the perception coverage area with a correct rate of 90%, the perception results of the roadside equipment for the vehicle 702 and its surrounding environment are relatively accurate; increasing the confidence of the perception results at this time allows more reliable driving maneuvers to be determined from them, improving safety. A minimal decision sketch is given below.
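The driving-strategy adjustment just described can be reduced to a small decision sketch. It assumes the first coverage area is available as a shapely polygon and collapses the strategy to a few labels; the two-branch scheme is an illustration, not the full strategy of the embodiment.

```python
from shapely.geometry import Point, Polygon

def driving_strategy(vehicle_position: Point, first_coverage_area: Polygon) -> dict:
    """Adjust the driving strategy depending on whether the vehicle is inside
    the first coverage area of the roadside device."""
    if first_coverage_area.contains(vehicle_position):
        return {"safety_level": "high",
                "perception_confidence": "increase",
                "reminder": "first reminder: automatic/assisted driving may be enabled"}
    return {"safety_level": "normal",
            "perception_confidence": "decrease",
            "reminder": "second reminder: please take over the vehicle"}
```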
  • In another possible implementation, the data processing device can obtain the blind areas according to the coverage information, so as to control the actions of the vehicle. For example, FIG. 9 is a schematic diagram of a possible blind area provided by the embodiment of the present application. When the vehicle is located in the communication blind area of the communication device 902, it can actively cut off the communication connection with the communication device 902, so that an unstable connection does not occupy the communication and processing resources of the vehicle. When the vehicle, or the detection result required by the vehicle, is located in the blind area of the sensing device 901, the confidence of the sensing results of the sensing device 901 can be reduced, or the perception results from the sensing device 901 can be left unused, as sketched below.
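The blind-area reactions just described can be summarized in a short helper; `vehicle_state` is an assumed dictionary holding the vehicle's connection and confidence settings for illustration.

```python
def handle_blind_areas(in_communication_blind_area: bool,
                       in_perception_blind_area: bool,
                       vehicle_state: dict) -> dict:
    """Apply the reactions described for FIG. 9."""
    if in_communication_blind_area:
        # actively release the link so an unstable connection does not occupy resources
        vehicle_state["connection_to_communication_device"] = "released"
    if in_perception_blind_area:
        # lower the confidence of (or ignore) the sensing results of the sensing device
        vehicle_state["sensing_result_confidence"] = "reduced"
    return vehicle_state
```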
  • It can be seen that in this embodiment of the present application, the content of the coverage information is designed to meet the usage requirements for the coverage of the roadside equipment. The coverage area and coverage capability of the roadside equipment can be determined according to the coverage information, and the reliability, robustness, and other properties of the service provided by the roadside equipment can be derived from it. For example, through the coverage information, the confidence of the perception results of the roadside equipment in a certain area, or the robustness of the communication connection with the roadside equipment in a certain area, can be obtained more accurately, which improves the reliability of automatic driving or assisted driving.
  • FIG. 13 is a schematic flowchart of another data processing method provided by the embodiment of the present application.
  • the data processing method shown in FIG. 13 may be applicable to the scenario shown in FIG. 1 above.
  • the data processing method may at least include the following steps:
  • Step S1301 The first data processing device generates coverage information of roadside equipment.
  • the first data processing device may be a terminal device (such as a roadside device, or a vehicle, etc.) or a network side device (such as a server, or a cloud, etc.).
  • The parameters used to generate the coverage information of the roadside equipment may be reported by the roadside equipment, may be collected by the first data processing device itself, or may be calculated by the first data processing device according to the perception results, communication results, and the like of the roadside device.
  • The coverage information, which may also be referred to as coverage data, includes the coverage area of the roadside device, the coverage capability information of the roadside device in the coverage area, and the like, where the coverage area of the roadside device is within the coverage range of the roadside device.
  • The coverage capability specifically refers to the coverage capability of the roadside equipment in the coverage area, and the coverage capability can be described by coverage capability information. For detailed descriptions of coverage information, coverage areas, and coverage capabilities, reference may be made to the relevant descriptions in step S401, and details are not repeated here.
  • The coverage capability information may indicate different indicators, also referred to as contents. When the coverage capability is the coverage capability of the roadside equipment in a communication coverage area, the coverage capability information can be used to indicate at least one of the following contents (or indicators): data accuracy rate, packet loss rate, communication delay, communication stability, or signal strength. When the coverage capability is the coverage capability of the roadside equipment in a perception coverage area, the coverage capability information is used to indicate at least one of the following contents (or indicators): correct rate of perception results, false detection rate, missed detection rate, recall rate, perception precision, perception average precision, detection stability, or detection position accuracy. A sketch of how these two groups of indicators could be carried is given below.
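The two groups of indicators can be represented, for example, as two records with optional fields, reflecting the "at least one of" wording; the field names and units are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CommunicationCoverageCapability:
    data_accuracy_rate: Optional[float] = None
    packet_loss_rate: Optional[float] = None
    communication_delay_ms: Optional[float] = None
    communication_stability: Optional[float] = None
    signal_strength_dbm: Optional[float] = None

@dataclass
class PerceptionCoverageCapability:
    correct_rate: Optional[float] = None
    false_detection_rate: Optional[float] = None
    missed_detection_rate: Optional[float] = None
    recall_rate: Optional[float] = None
    average_precision: Optional[float] = None
    detection_stability: Optional[float] = None
    detection_position_accuracy_m: Optional[float] = None
```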
  • the number of coverage areas included in the coverage information may be one or more.
  • the coverage capability information may also be one or more.
  • the multiple sensing coverage areas may be divided according to levels of sensing capabilities.
  • the multiple communication coverage areas may be divided according to communication capability levels.
  • When the coverage information includes multiple coverage areas, there may be overlapping areas among the multiple coverage areas.
  • the roadside device may include one or more sensing devices, or may be connected to one or more sensing devices.
  • the perception capability of the roadside equipment can be realized through the perception equipment.
  • sensing devices may be combined, and one or more sensing devices may form a sensing device group.
  • the sensing coverage area in the coverage information may correspond to a sensing device or a sensing device group.
  • the sensing coverage area corresponding to the sensing device group and the coverage capability in the sensing coverage area are determined according to the coverage capabilities of the sensing devices in the sensing device group.
  • the coverage area in the coverage information may be obtained by fusing the coverage areas of multiple single devices.
  • the coverage information may also include information about blind areas, where the blind areas may include at least one of communication blind areas, perception blind areas, and the like.
  • the coverage capability information of the roadside device in the at least one coverage area indicates multiple capabilities under multiple environments.
  • the coverage information may also include one or more items of roadside equipment identifiers, tile IDs, and the like.
  • For a detailed description of related concepts in step S1301, reference may be made to the related descriptions in step S401, which are not repeated here.
  • Step S1302 the first data processing device sends coverage information.
  • the first data processing device may communicate with other devices through wired, wireless links, or wired and wireless combined links, so as to send coverage information to other devices.
  • The data link for sending and receiving information between the first data processing device and other devices may include various types of connection media, specifically wired links (such as optical fibers), wireless links, or a combination of wired and wireless links.
  • For example, the wireless links may include 802.11b/g, Bluetooth, Zigbee, vehicle short-range wireless communication technology, Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Universal Mobile Telecommunications System (UMTS), Ultra Wideband (UWB) technology, vehicle wireless transmission technology, and the like.
  • Step S1303 The second data processing device obtains the coverage information of the roadside equipment.
  • the first data processing device may send the coverage information of the roadside equipment to the second data processing device.
  • the second data processing device receives the coverage information of the roadside equipment from the first data processing device.
  • the second data processing device may be a terminal device (such as a roadside device, or a vehicle, etc.) or a network side device (such as a server, or a cloud, etc.).
  • For example, referring to FIG. 7, the first data processing device may be the roadside device 701, and the second data processing device may be the vehicle 702 (or the cloud 703).
  • the roadside device can generate coverage information according to its own coverage capability, and can also send the coverage information to the vehicle 702 (or the cloud 703).
  • the vehicle 702 (or the cloud 703) obtains the coverage information.
  • the first data processing device may be the cloud 703, and the second data processing device may be the vehicle 702 (roadside equipment 701).
  • the first data processing device may send the coverage information to the vehicle 702 (the roadside device 701 ), and correspondingly, the vehicle 702 (the roadside device 701 ) may receive the coverage information sent by the cloud 703 .
  • the coverage information may include coverage information of the roadside device 701, and optionally also include coverage information of other roadside devices (not shown in FIG. 7 ).
  • Step S1304 The second data processing device uses the coverage information to update the map or control the action of the vehicle, that is, to generate a control signal for controlling the vehicle.
  • For details, reference may be made to the related descriptions in step S402, which are not repeated here.
  • the embodiment of the present application also provides coverage data, where the coverage data is used to describe the coverage range of the roadside equipment.
  • the coverage data includes the coverage area, the coverage capability of the roadside equipment in the coverage area, and the like.
  • the coverage area of the roadside device is within the coverage area of the roadside device, and may include a perception coverage area, a communication coverage area, and the like.
  • the coverage capability specifically refers to the coverage capability of the roadside equipment in the coverage area, and the coverage capability can be described by coverage capability information.
  • The coverage capability may indicate different indicators, also referred to as contents. When the coverage capability is the coverage capability of the roadside equipment in a communication coverage area, the coverage capability information can be used to indicate at least one of the following contents (or indicators): data accuracy rate, packet loss rate, communication delay, communication stability, or signal strength.
  • When the coverage capability is the coverage capability of the roadside device within a perception coverage area, the coverage capability information is used to indicate at least one of the following contents (or indicators): correct rate of perception results, false detection rate, missed detection rate, recall rate, perception precision, perception average precision (Average Precision, AP), detection stability, or detection position accuracy.
  • the number of coverage areas included in the coverage information may be one or more.
  • the coverage capability information may also be one or more.
  • the multiple sensing coverage areas may be divided according to levels of sensing capabilities.
  • the multiple communication coverage areas may be divided according to communication capability levels.
  • the sensing coverage area in the coverage information may correspond to a sensing device or a sensing device group.
  • the sensing coverage area corresponding to the sensing device group and the coverage capability in the sensing coverage area are determined according to the coverage capabilities of the sensing devices in the sensing device group.
  • the coverage area in the coverage information may be obtained by fusing the coverage areas of multiple single devices.
  • Optionally, the coverage information may also include one or more items such as the roadside device identification, the tile ID, and blind area information, where the blind areas may include at least one of communication blind areas, perception blind areas, and the like.
  • The coverage data can be divided into multiple levels for representation. In a possible design, the first level is the roadside equipment ID; the next level of the roadside equipment ID (referred to as the second level for convenience of description) includes multiple range levels; and the next level of each range level (referred to as the third level for convenience of description) includes the coverage capability information and the coverage area (or coverage area indication information). For example, the structure of the communication coverage data may be as shown in FIG. 11.
  • In another possible design, the first level is the roadside device ID, and the next level of the roadside device ID includes the sensing device ID or sensing device group ID. The next level of the sensing device ID (or sensing device group ID) (referred to as the third level for convenience of description) includes multiple range levels, and the lower level of each range level (referred to as the fourth level for convenience of description) includes the coverage capability information and the coverage area (or coverage area indication information). For example, the structure of the perception coverage data may be as shown in FIG. 12.
  • FIG. 14 is a schematic structural diagram of a data processing device 140 (hereinafter referred to as device 140) provided by an embodiment of the present application.
  • the device 140 can be an independent device, or a device in an independent device, such as chips or integrated circuits.
  • the device 140 may be the data processing device in the embodiment shown in FIG. 4 , or a device in the data processing device, such as a chip or an integrated circuit.
  • the device 140 may be the second data processing device in the embodiment shown in FIG. 13 , or a device in the second data processing device, such as a chip or an integrated circuit.
  • the device 140 includes an acquisition unit 1401 and a storage unit 1402 .
  • the acquiring unit 1401 is configured to acquire the coverage information of the roadside equipment, the coverage information includes coverage area information indicating at least one coverage area of the roadside equipment and indicating the coverage area of the roadside equipment in the Coverage capability information of coverage capabilities in at least one coverage area.
  • the storage unit 1402 is configured to store the coverage information as map data.
  • For the coverage information, coverage area, coverage area information, coverage capability, and coverage capability information, please refer to the above description; details are not repeated here.
  • In a possible implementation, each unit corresponds to its own program code (or program instructions). When the program code corresponding to these units runs on a processor, the units are controlled by the processor to execute the corresponding processes to realize the corresponding functions.
  • FIG. 15 is a schematic structural diagram of a data processing device 150 (hereinafter referred to as device 150) provided by an embodiment of the present application.
  • the device 150 can be an independent device, or a device in an independent device, such as chips or integrated circuits.
  • the device 150 includes a processing unit 1501 , a storage unit 1502 , a communication unit 1503 and a display unit 1504 .
  • the processing unit 1501 is configured to generate the coverage information of the roadside equipment, where the coverage information includes coverage area information for indicating at least one coverage area of the roadside equipment and information for indicating at least one coverage area of the roadside equipment The coverage capability information of the coverage capability in the at least one coverage area; the storage unit 1502 is configured to store the coverage information generated by the processing unit 1501 as map data.
  • the communication unit 1503 is configured to receive the coverage information of the roadside equipment, the coverage information including coverage area information indicating at least one coverage area of the roadside equipment and the coverage area information indicating at least one coverage area of the roadside equipment The coverage capability information of the coverage capability in the at least one coverage area; the storage unit 1502 is configured to store the coverage information received by the communication unit 1503 as map data.
  • For the coverage information, coverage area, coverage area information, coverage capability, and coverage capability information, please refer to the above description; details are not repeated here.
  • Optionally, the display unit 1504 is configured to display the above-mentioned coverage information on a display interface.
  • the communication unit 1503 is configured to send the coverage information.
  • the processing unit 1501 is further configured to use the coverage information to generate a control signal for controlling the vehicle.
  • the processing unit 1501 is further configured to use the coverage information to perform information processing, such as determining the confidence level of the perceived information or determining the safety level of the vehicle.
  • FIG. 16 is a schematic structural diagram of a data processing device 160 provided by an embodiment of the present application.
  • The device 160 may be an independent device (such as a node or a terminal), or a device in an independent device, such as a chip or an integrated circuit.
  • the apparatus 160 may include at least one processor 1601 and a communication interface 1602 . Further optionally, the apparatus 160 may further include at least one memory 1603 . Further optionally, a bus 1604 may also be included, wherein the processor 1601 , the communication interface 1602 and the memory 1603 are connected through the bus 1604 .
  • The processor 1601 is a module that performs arithmetic operations and/or logical operations, and may specifically be one or a combination of processing modules such as a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor unit (MPU), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), a complex programmable logic device (CPLD), a coprocessor (assisting the central processing unit to complete corresponding processing and applications), and a microcontroller unit (MCU).
  • Communication interface 1602 may be used to provide information input or output to the at least one processor. And/or, the communication interface 1602 can be used to receive data sent from the outside and/or send data to the outside, and can be a wired link interface such as an Ethernet cable, or a wireless link (Wi-Fi, Bluetooth, general wireless transmission, vehicle short-range communication technology and other short-range wireless communication technologies, etc.) interface. Optionally, the communication interface 1602 may further include a transmitter (such as a radio frequency transmitter, an antenna, etc.) or a receiver coupled with the interface.
  • communication interface 1602 may also include an antenna.
  • the electromagnetic wave is received through the antenna, and the communication interface 1602 can also frequency-modulate and filter the electromagnetic wave signal, and send the processed signal to the processor 1601 .
  • the communication interface 1602 can also receive the signal to be sent from the processor 1601, perform frequency modulation on it, amplify it, and convert it into electromagnetic wave and radiate it through the antenna.
  • the memory 1603 is used to provide a storage space, in which data such as operating systems and computer programs can be stored.
  • The memory 1603 can be a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a portable read-only memory (compact disc read-only memory, CD-ROM), or one or more combinations of these.
  • At least one processor 1601 in the apparatus 160 is configured to invoke a computer program stored in at least one memory 1603 to execute the aforementioned method, for example, the method described in the embodiments shown in FIG. 4 and FIG. 13 .
  • the device 160 may be the data processing device in the embodiment shown in FIG. 4 , or a device in the data processing device, such as a chip or an integrated circuit.
  • the device 160 may be the second data processing device in the embodiment shown in FIG. 13 , or a device in the second data processing device, such as a chip or an integrated circuit.
  • the embodiment of the present application also provides a terminal, the terminal is used to implement the method described in the embodiment shown in FIG. 4 or FIG. 13 .
  • the terminals include but are not limited to vehicles or portable terminals.
  • the terminal includes the aforementioned device, for example, the device shown in FIG. 14 , FIG. 15 or FIG. 16 .
  • The embodiment of the present application also provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium. When the computer program is run on one or more processors, the method described in the embodiment shown in FIG. 4 or FIG. 13 is implemented.
  • the embodiment of the present application also provides a computer program product, and when the computer program product runs on one or more processors, the method described in the embodiment shown in FIG. 4 is implemented.
  • An embodiment of the present application also provides a chip system, the chip system includes a communication interface and at least one processor, the communication interface is used to provide information input/output for the at least one processor, and/or, the communication interface Used to send or receive data.
  • the processor is configured to call a computer program (or computer instruction) to implement the method described in the embodiment shown in FIG. 4 or FIG. 13 .
  • the computer program in the memory in this application can be stored in advance or can be stored after being downloaded from the Internet when using the device.
  • This application does not specifically limit the source of the computer program in the memory.
  • the coupling in the embodiments of the present application is an indirect coupling or connection between devices, units or modules, which may be in electrical, mechanical or other forms, and is used for information exchange between devices, units or modules.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from a website, computer, server, or data center Transmission to another website site, computer, server, or data center by wired (eg, coaxial cable, optical fiber, DSL) or wireless (eg, infrared, wireless, microwave, etc.) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, solid state hard disk), etc.
  • All or part of the processes in the foregoing method embodiments can be completed by a computer program instructing related hardware. The program can be stored in a computer-readable storage medium, and when the program is executed, the processes of the foregoing method embodiments may be included.
  • the aforementioned storage medium includes: ROM or random access memory RAM, magnetic disk or optical disk, and other various media that can store program codes.


Abstract

The embodiments of this application disclose a data processing method and device, applied to the field of electronic maps. By storing the coverage information of roadside equipment in a map, the content of the map is enriched, so that the map can meet users' higher-level usage requirements. The coverage information includes coverage area information for indicating at least one coverage area of the roadside equipment and coverage capability information for indicating the coverage capability of the roadside equipment in the at least one coverage area. The coverage information can be used to generate a control signal for controlling a vehicle, thereby improving the safety of automatic driving or assisted driving.

Description

数据处理方法及装置
本申请要求于2021年08月27日提交中国专利局、申请号为202110996319.4、申请名称为“数据处理方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子地图领域,具体涉及数据处理方法及装置。
背景技术
随着社会的发展,智能汽车正在逐步进入人们的日常生活中。传感器在智能汽车的辅助驾驶和自动驾驶中发挥着十分重要的作用。安装在车上的各式各样的传感器,比如毫米波雷达、激光雷达、超声波雷达或摄像头等,可以在汽车行驶过程中感知周围的环境、辨识与追踪移动物体、以及识别静止场景(如车道线、标示牌)。总之来说,传感器可以预先察觉到可能发生的危险并及时提醒驾驶员,或辅助驾驶员或自动采取措施避免危险发生,有效增加了汽车驾驶的安全性和舒适性。
高精地图(High-Definition Map,HD MAP),又称为高清地图或者高精度地图,作为实现自动驾驶的关键能力之一,将成为对自动驾驶现有传感器的有效补充,提升车辆的自动驾驶决策的安全性。与传统的导航地图相比,服务于自动驾驶的高精地图在各方面要求更高,并能配合传感器和算法,为决策层提供支持。由于自动驾驶过程中,外界会动态地发生变化影响到车辆的行驶,因此,高精地图除静态图层外,越来越需要更多的动态信息,以满足交通领域的发展需求。然而,现有的地图内容丰富程度还不能充分满足未来使用的需求。
发明内容
本申请实施例提供一种数据处理方法及装置,在地图中增加了一种新型的地图信息,即路侧设备的覆盖信息,提高了地图信息的丰富程度,能够满足更高层次的地图使用需求。
第一方面,本申请实施例提供一种数据处理方法,该方法包括:
获取路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;
将所述覆盖信息存储为地图数据。
本申请实施例在地图中对路侧设备的覆盖信息进行了维护,满足了用户的使用需求。后续在其他设备使用路侧设备提供的信息时,可以从地图中获得该路侧设备的覆盖区域和在该覆盖区域内的覆盖能力,为如何使用路侧设备提供的信息提供参考。例如通过覆盖信息,可以更准确地确定路侧设备在某一区域内的感知结果的置信度,或者确定在某一区域内与路侧设备之间通信连接的鲁棒性等指标,提升了自动驾驶或者辅助驾驶的可靠性。
其中,获取路侧设备的覆盖信息,可以是生成路侧设备的覆盖信息。由于云端设备、路侧设备和终端设备都具有信息生成能力,因此该情况下该方法的执行主体可以为云端设备、路侧设备或终端设备,包括但不限于云端的地图服务器、应用服务器、路侧单元(Road Side Unit,RSU)、边缘处理器(multi-access edge computing,MEC)、车辆或便携终端等设备,或者这些设备内的部件、芯片、软件模块或硬件模块。
获取路侧设备的覆盖信息,也可以是接收路侧设备的覆盖信息。一种情况所述接收是设备之间基于无线通信或有线通信的接收操作,该情况下该方法的执行主体可以为云端设备、路侧设备或终端设备,包括但不限于云端的地图服务器、应用服务器、路侧单元(Road Side Unit,RSU)、边缘处理器(multi-access edge computing,MEC)、车辆或便携终端等设备,应用场景包括但不限于车车之间、路路之间、车云之间或者车路之间的信息传递;还有一种情况是设备内基于总线、走线、接口或参数被模块调用的接收操作,该情况下该方法的执行主体可以为上述设备内的部件、芯片、软件模块或硬件模块。
其中,将所述覆盖信息存储为地图数据,是指将该覆盖信息作为一种在地图中承载的信息,采用地图中其他信息的编译形式或存储格式,存储于地图数据库中。本方法的执行主体可以位于云端、路侧或终端,该地图数据可以相应地存储于云端、路侧或终端的存储介质中。
在又一种可能的实施方式中,利用所述覆盖信息,生成地图或者更新地图,包括:
根据所述覆盖信息生成或更新地图中的一个图层。进一步的,所述地图可以为高精地图。
在又一种可能的实施方式中,所述覆盖信息还包括瓦片的标识。
通过瓦片的标识可以将覆盖信息关联到瓦片,可以便于利用地图数据的管理方式维护所述覆盖信息。
其中,瓦片可以理解为:将一定范围内的地图按照一定的尺寸和格式,以及不同的地图分辨率,切成若干行和列的矩形栅格图片,对切片后的矩形栅格图片称为瓦片(Tile)。地图分辨率越高,意味着切割次数越多,则组成该地图的瓦片数量就越多,瓦片的等级也越高。当切割方式为十字切割时,则某一等级的瓦片是由对应的高一级别的4个瓦片组成。
例如,瓦片1是地图中某一等级的瓦片,对瓦片1进行十字切割可进一步生成比瓦片1的等级高一级别的4块瓦片,标识分别为1-00、1-01、1-10和1-11。可以理解,瓦片1的地理覆盖范围为瓦片1-00的地理覆盖范围、瓦片1-01的地理覆盖范围、瓦片1-10的地理覆盖范围和瓦片1-11的地理覆盖范围的并集。
在又一种可能的实施方式中,所述覆盖信息还包括所述路侧设备的标识。
在一种可能的实施方式中,所述至少一个覆盖区域包括M个通信覆盖区域和N个感知覆盖区域,其中,所述M和所述N为自然数,且所述M和所述N不同时为0。
也就是说,覆盖区域可以包括一个或者多个通信覆盖区域,也可以包括一个或者多个感知覆盖区域,还可以既包括通信覆盖区域也包括感知覆盖区域。
其中,通信覆盖区域用于体现路侧设备的通信能力,感知覆盖区域用于体现路侧设备的感知能力。
在又一种可能的实施方式中,所述至少一个覆盖区域根据所述至少一个覆盖区域的覆盖能力按等级划分。具体来说,所述覆盖信息包括用于指示所述M个通信覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述M个通信覆盖区域内的覆盖能力的覆盖能力信息,所述M个通信覆盖区域根据所述M个通信覆盖区域的覆盖能力按等级划分,其中M大于1。
上述说明了包括多个通信覆盖区域的可能情况,由于在不同的区域所对应的覆盖能力不同,按照不同的覆盖能力的等级划分得到不同的通信覆盖区域,便于确定能力边界。另外,使用多个根据能力等级划分的通信覆盖区域,可以使得覆盖信息的结构更清晰,便于管理和使用。
在又一种可能的实施方式中,所述覆盖信息包括用于指示所述N个感知覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述N个感知覆盖区域内的覆盖能力的覆盖能力信息,所述N个感知覆盖区域根据所述N个感知覆盖区域的覆盖能力按等级划分,其中N大于1。
上述说明了包括多个感知覆盖区域的可能情况,可以参考前述对于通信覆盖区域的描述。
在又一种可能的实施方式中,所述N个感知覆盖区域中的至少一个感知覆盖区域对应感知设备组,体现在所述N个感知覆盖区域包括多设备感知覆盖区域,所述多设备感知覆盖区域和所述路侧设备在所述多设备感知覆盖区域内的覆盖能力是根据与所述路侧设备相关的多个感知设备的覆盖能力确定的。
上述说明了对于感知覆盖区域的可能设计。与所述路侧设备相关的多个感知设备,可以是路侧设备内包含的多个感知设备,或者是向所述路侧设备发送感知信息的多个感知设备相关联。路侧设备的感知覆盖区域可以对应独立的一个感知设备,也可以对应感知设备组。其中,感知设备组内包含一个或者多个与所述路侧设备相关的感知设备。
例如,感知设备组可以包含激光雷达和相机,可以对该激光雷达和相机感知到的信息进行融合,得到关于融合后的感知能力的感知覆盖区域和对应该感知覆盖区域的覆盖能力。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备组;所述覆盖信息还包括:感知设备组的标识。
可以看出,感知覆盖区域对应感知设备组的情况下,覆盖信息中可以包括感知设备组的标识,从而使得感知覆盖信息结构更加清晰,便于使用和管理。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备;所述覆盖信息还包括:感知设备的标识。
在又一种可能的实施方式中,所述N个感知覆盖区域中,一部分感知区域对应感知设备组,另一部分对应感知设备;所述覆盖信息还包括:感知设备组的标识和感知设备的标识。
例如,感知设备组的覆盖能力可以是根据多个感知设备的感知能力融合得到的。将融合后的覆盖能力,按照等级划分区域,从而得到对应该感知设备组的覆盖区域。
再如,感知设备组的覆盖区域可以是根据多个子覆盖区域的并集部分得到的,每一个子覆盖区域可以对应感知设备组中的一个感知设备。
在又一种可能的实施方式中,所述路侧设备与第一感知设备和第二感知设备相关,所述N个感知覆盖区域包括所述第一感知设备的第一覆盖区域和所述第二感知设备的第二覆盖区域,所述覆盖能力信息包括用于指示所述第一感知设备在所述第一覆盖区域内的覆盖能力的第一覆盖能力信息和用于指示所述第二感知设备在所述第二覆盖区域内的覆盖能力的第二覆盖能力信息。
在又一种可能的实施方式中,所述覆盖信息还包括盲区的信息,所述盲区包括通信盲区,感知盲区,或通信盲区和感知盲区。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在通信覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
数据正确率、丢包率、通信时延、通信稳定性和信号强度。
本申请实施例中示例性地给出了几种覆盖能力信息指示的内容(或者说指标),通过覆盖能力信息来指示上述的一种或者多种内容,可以提高覆盖信息的设计合理性,从而便于后续使用。
例如,车辆行驶过程中可以随时与路侧设备进行通信,根据路侧设备的通信稳定性可以指示通信情况,便于及时规划、调整与路侧设备之间的通信需求。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在感知覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
感知结果正确率、误检率、漏检率、召回率、感知精度、检测稳定性和检测位置精度。
例如,自动驾驶的策略离不开感知结果,此时根据感知能力指示的感知结果正确率、召回率来确定感知结果的置信度,可以提高自动驾驶的策略的可靠性。
在又一种可能的实施方式中,所述覆盖能力信息指示多种环境下的所述覆盖能力。
例如,晴天、雨天、雾霾天气等不同的环境下,覆盖能力对应的覆盖区域可以是不同的。再如,白天、夜晚等不同的时刻,不同的温度、湿度、亮度条件下,覆盖能力对应的区域可以不同。通过多种环境下的多种能力对应的覆盖区域,在后续使用覆盖信息时,可以合理地考虑场景因素,提高覆盖信息的准确性。
在又一种可能的实施方式中,所述覆盖区域为道路的区间段或车道的区间段,更便于基于覆盖信息辅助驾驶。
在又一种可能的实施方式中,将该覆盖信息在显示界面上进行显示。所述显示界面包括但不限于车辆上的显示屏、便携终端上的显示屏或者投影显示的屏幕。通过在显示界面上显示所述覆盖信息,可以使用户直观的了解路侧设备的覆盖能力。所述显示方式可以为图形界面显示,例如在地图显示界面上叠加显示所述覆盖区域,或者进一步还显示与覆盖区域对应的覆盖能力;所述显示方式还可以为文字显示。
在又一种可能的实施方式中,发送所述覆盖信息。地图生成侧设备可以将该覆盖信息承载于地图数据包发送给地图使用侧设备。
在又一种可能的实施方式中,利用所述覆盖信息,进行信息处理或者生成用于控制所述车辆的控制信号。例如:
当车辆位于某个覆盖区域时,根据覆盖信息指示的该覆盖区域以及该覆盖区域内的覆盖能力,确定所述车辆的安全等级;或者确定来自所述路侧设备的感知结果的置信度;或者触发第一提醒消息以提醒用户开启所述车辆的自动驾驶功能或者开启所述车辆的辅助驾驶功能;或者触发第二提醒消息以提醒用户接管所述车辆。
上述例举了多种可能的实施情况,可以看出,根据路侧设备的覆盖信息,可以更准确地确定感知结果(或者通信数据)的可靠性。在驾驶过程中,可以根据覆盖信息执行多种信息处理操作或车辆控制操作,从而提高驾驶的安全性。
第二方面,本申请实施例提供的一种数据处理装置,包括:
获取单元,用于获取路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;
存储单元,用于将所述覆盖信息存储为地图数据。
本申请实施例在地图中对路侧设备的覆盖信息进行了维护,满足了用户的使用需求。后续在其他设备使用路侧设备提供的信息时,可以从地图中获得该路侧设备的覆盖区域和在该覆盖区域内的覆盖能力,为如何使用路侧设备提供的信息提供参考。例如通过覆盖信息,可以更准确地确定路侧设备在某一区域内的感知结果的置信度,或者确定在某一区域内与路侧设备之间通信连接的鲁棒性等指标,提升了自动驾驶或者辅助驾驶的可靠性。
其中,获取单元可以是用于生成路侧设备的覆盖信息的处理单元。由于云端设备、路侧设备和终端设备都具有信息生成能力,因此该情况下该数据处理装置可以为云端设备、路侧设备或终端设备,包括但不限于云端的地图服务器、应用服务器、路侧单元(Road Side Unit,RSU)、边缘处理器(multi-access edge computing,MEC)、车辆或便携终端等设备,或者这些设备内的部件、芯片、软件模块或硬件模块。
获取单元也可以是用于接收路侧设备的覆盖信息的通信单元。一种情况所述接收是设备之间基于无线通信或有线通信的接收操作,该情况下数据处理装置可以为云端设备、路侧设备或终端设备,包括但不限于云端的地图服务器、应用服务器、路侧单元(Road Side Unit,RSU)、边缘处理器(multi-access edge computing,MEC)、车辆或便携终端等设备,应用场景包括但不限于车车之间、路路之间、车云之间或者车路之间的信息传递;还有一种情况是设备内基于总线、走线、接口或参数被模块调用的接收操作,该情况下数据处理装置可以为上述设备内的部件、芯片、软件模块或硬件模块。
其中,将所述覆盖信息存储为地图数据,是指将该覆盖信息作为一种在地图中承载的信息,采用地图中其他信息的编译形式或存储格式,存储于地图数据库中。本方法的执行主体可以位于云端、路侧或终端,该地图数据可以相应地存储于云端、路侧或终端的存储介质中。
在又一种可能的实施方式中,所述装置中包括的处理单元利用所述覆盖信息,生成地图或者更新地图。具体可以为根据所述覆盖信息生成或更新地图中的一个图层。进一步的,所述地图可以为高精地图。
在又一种可能的实施方式中,所述覆盖信息还包括瓦片的标识。
通过瓦片的标识可以将覆盖信息关联到瓦片,可以便于利用地图数据的管理方式维护所述覆盖信息。
在又一种可能的实施方式中,所述覆盖信息还包括所述路侧设备的标识。
在一种可能的实施方式中,所述至少一个覆盖区域包括M个通信覆盖区域和N个感知覆盖区域,其中,所述M和所述N为自然数,且所述M和所述N不同时为0。
也就是说,覆盖区域可以包括一个或者多个通信覆盖区域,也可以包括一个或者多个感知覆盖区域,还可以既包括通信覆盖区域也包括感知覆盖区域。
其中,通信覆盖区域用于体现路侧设备的通信能力,感知覆盖区域用于体现路侧设备的感知能力。
在又一种可能的实施方式中,所述至少一个覆盖区域根据所述至少一个覆盖区域的覆盖能力按等级划分。具体来说,所述覆盖信息包括用于指示所述M个通信覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述M个通信覆盖区域内的覆盖能力的覆盖能力信息,所述M个通信覆盖区域根据所述M个通信覆盖区域的覆盖能力按等级划分,其中M大于1。
上述说明了包括多个通信覆盖区域的可能情况,由于在不同的区域所对应的覆盖能力不同,按照不同的覆盖能力的等级划分得到不同的通信覆盖区域,便于确定能力边界。另外,使用多个根据能力等级划分的通信覆盖区域,可以使得覆盖信息的结构更清晰,便于管理和使用。
在又一种可能的实施方式中,所述覆盖信息包括用于指示所述N个感知覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述N个感知覆盖区域内的覆盖能力的覆盖能力信息,所述N个感知覆盖区域根据所述N个感知覆盖区域的覆盖能力按等级划分,其中N大于1。
上述说明了包括多个感知覆盖区域的可能情况,可以参考前述对于通信覆盖区域的描述。
在又一种可能的实施方式中,所述N个感知覆盖区域中的至少一个感知覆盖区域对应感知设备组,体现在所述N个感知覆盖区域包括多设备感知覆盖区域,所述多设备感知覆盖区域和所述路侧设备在所述多设备感知覆盖区域内的覆盖能力是根据与所述路侧设备相关的多个感知设备的覆盖能力确定的。
上述说明了对于感知覆盖区域的可能设计。与所述路侧设备相关的多个感知设备,可以是路侧设备内包含的多个感知设备,或者是向所述路侧设备发送感知信息的多个感知设备相 关联。路侧设备的感知覆盖区域可以对应独立的一个感知设备,也可以对应感知设备组。其中,感知设备组内包含一个或者多个与所述路侧设备相关的感知设备。
例如,感知设备组可以包含激光雷达和相机,可以对该激光雷达和相机感知到的信息进行融合,得到关于融合后的感知能力的感知覆盖区域和对应该感知覆盖区域的覆盖能力。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备组;所述覆盖信息还包括:感知设备组的标识。
可以看出,感知覆盖区域对应感知设备组的情况下,覆盖信息中可以包括感知设备组的标识,从而使得感知覆盖信息结构更加清晰,便于使用和管理。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备;所述覆盖信息还包括:感知设备的标识。
在又一种可能的实施方式中,所述N个感知覆盖区域中,一部分感知区域对应感知设备组,另一部分对应感知设备;所述覆盖信息还包括:感知设备组的标识和感知设备的标识。
例如,感知设备组的覆盖能力可以是根据多个感知设备的感知能力融合得到的。将融合后的覆盖能力,按照等级划分区域,从而得到对应该感知设备组的覆盖区域。
再如,感知设备组的覆盖区域可以是根据多个子覆盖区域的并集部分得到的,每一个子覆盖区域可以对应感知设备组中的一个感知设备。
在又一种可能的实施方式中,所述路侧设备与第一感知设备和第二感知设备相关,所述N个感知覆盖区域包括所述第一感知设备的第一覆盖区域和所述第二感知设备的第二覆盖区域,所述覆盖能力信息包括用于指示所述第一感知设备在所述第一覆盖区域内的覆盖能力的第一覆盖能力信息和用于指示所述第二感知设备在所述第二覆盖区域内的覆盖能力的第二覆盖能力信息。
在又一种可能的实施方式中,所述覆盖信息还包括盲区的信息,所述盲区包括通信盲区,感知盲区,或通信盲区和感知盲区。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在通信覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
数据正确率、丢包率、通信时延、通信稳定性和信号强度。
本申请实施例中示例性地给出了几种覆盖能力信息指示的内容(或者说指标),通过覆盖能力信息来指示上述的一种或者多种内容,可以提高覆盖信息的设计合理性,从而便于后续使用。
例如,车辆行驶过程中可以随时与路侧设备进行通信,根据路侧设备的通信稳定性可以指示通信情况,便于及时规划、调整与路侧设备之间的通信需求。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在感知覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
感知结果正确率、误检率、漏检率、召回率、感知精度、检测稳定性和检测位置精度。
例如,自动驾驶的策略离不开感知结果,此时根据感知能力指示的感知结果正确率、召回率来确定感知结果的置信度,可以提高自动驾驶的策略的可靠性。
在又一种可能的实施方式中,所述覆盖能力信息指示多种环境下的所述覆盖能力。
例如,晴天、雨天、雾霾天气等不同的环境下,覆盖能力对应的覆盖区域可以是不同的。再如,白天、夜晚等不同的时刻,不同的温度、湿度、亮度条件下,覆盖能力对应的区域可以不同。通过多种环境下的多种能力对应的覆盖区域,在后续使用覆盖信息时,可以合理地考虑场景因素,提高覆盖信息的准确性。
在又一种可能的实施方式中,所述覆盖区域为道路的区间段或车道的区间段,更便于基于覆盖信息辅助驾驶。
在又一种可能的实施方式中,所述装置包括显示单元,将该覆盖信息在显示界面上进行显示。所述显示界面包括但不限于车辆上的显示屏、便携终端上的显示屏或者投影显示的屏幕。通过在显示界面上显示所述覆盖信息,可以使用户直观的了解路侧设备的覆盖能力。所述显示方式可以为图形界面显示,例如在地图显示界面上叠加显示所述覆盖区域,或者进一步还显示与覆盖区域对应的覆盖能力;所述显示方式还可以为文字显示。
在又一种可能的实施方式中,所述装置包括通信单元,用于发送所述覆盖信息。地图生成侧设备可以将该覆盖信息承载于地图数据包发送给地图使用侧设备。
在又一种可能的实施方式中,所述装置包括处理单元,用一个利用所述覆盖信息,进行信息处理或者生成用于控制所述车辆的控制信号。例如:
当车辆位于某个覆盖区域时,根据覆盖信息指示的该覆盖区域以及该覆盖区域内的覆盖能力,确定所述车辆的安全等级;或者确定来自所述路侧设备的感知结果的置信度;或者触发第一提醒消息以提醒用户开启所述车辆的自动驾驶功能或者开启所述车辆的辅助驾驶功能;或者触发第二提醒消息以提醒用户接管所述车辆。
第三方面,本申请实施例提供一种数据处理装置,该装置可以包括处理器,用于实现上述第一方面或上述第一方面任一种可能的实现方式描述的数据处理方法。
在一种可能的实施方式中,该装置还可以包括存储器,该存储器与处理器耦合,处理器执行存储器中存储的计算机程序时,可以实现上述第一方面或上述第一方面任一种可能的实现方式描述的数据处理方法。
在又一种可能的实施方式中,该装置还可以包括通信接口,所述通信接口用于接收计算机执行指令并传输至所述处理器,所述处理器用于执行所述计算机执行指令,以使所述数据处理装置执行上述第一方面或上述第一方面任一种可能的实现方式描述的数据处理方法。
需要说明的是,本申请实施例中存储器中的计算机程序可以预先存储也可以使用该设备时从网络下载后存储,本申请实施例对于存储器中计算机程序的来源不进行具体限定。本申请实施例中的耦合是装置、单元或模块之间的间接耦合或连接,其可以是电性、机械或其它的形式,用于装置、单元或模块之间的信息交互。
第四方面,本申请实施例提供一种计算机可读存储介质,上述计算机可读存储介质存储有计算机程序,上述计算机程序被处理器执行以实现上述第一方面或上述第一方面任一种可能的实现方式描述的数据处理方法。
第五方面,本申请实施例提供一种计算机程序产品,当上述计算机程序产品被处理器读取并执行时,上述第一方面或上述第一方面任一种可能的实现方式描述的数据处理方法将被执行。
上述第三方面至第五方面提供的方案,用于实现或配合实现上述第一方面提供的方法,因此可以与第一方面达到相同或相应的有益效果,此处不再进行赘述。
第六方面,本申请实施例提供一种车辆,该车辆包括上述第二方面或上述第二方面任一种可能的实现方式描述的数据处理装置,或者包括上述第三方面或上述第三方面任一种可能的实现方式描述的数据处理装置。
第七方面,本申请实施例提供一种地图,所述地图包括路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息。
本发明实施例中的地图为地图产品,具体来说,可以是承载地图信息的地图数据产品,如地图更新数据包;或者可以为加载地图信息的地图应用产品,如可安装于车辆或便携终端上的地图应用程序;或者还可以为呈现地图信息的地图展示产品,如纸质地图或者电子导航仪。
本申请实施例在地图中对路侧设备的覆盖信息进行了维护,满足了用户的使用需求。后续在其他设备使用路侧设备提供的信息时,可以从地图中获得该路侧设备的覆盖区域和在该覆盖区域内的覆盖能力,为如何使用路侧设备提供的信息提供参考。例如通过覆盖信息,可以更准确地确定路侧设备在某一区域内的感知结果的置信度,或者确定在某一区域内与路侧设备之间通信连接的鲁棒性等指标,提升了自动驾驶或者辅助驾驶的可靠性。
进一步的,所述地图可以为高精地图。
在又一种可能的实施方式中,所述覆盖信息还包括瓦片的标识。
通过瓦片的标识可以将覆盖信息关联到瓦片,可以便于利用地图数据的管理方式维护所述覆盖信息。
在又一种可能的实施方式中,所述覆盖信息还包括所述路侧设备的标识。
在一种可能的实施方式中,所述至少一个覆盖区域包括M个通信覆盖区域和N个感知覆盖区域,其中,所述M和所述N为自然数,且所述M和所述N不同时为0。
也就是说,覆盖区域可以包括一个或者多个通信覆盖区域,也可以包括一个或者多个感知覆盖区域,还可以既包括通信覆盖区域也包括感知覆盖区域。
其中,通信覆盖区域用于体现路侧设备的通信能力,感知覆盖区域用于体现路侧设备的感知能力。
在又一种可能的实施方式中,所述至少一个覆盖区域根据所述至少一个覆盖区域的覆盖能力按等级划分。具体来说,所述覆盖信息包括用于指示所述M个通信覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述M个通信覆盖区域内的覆盖能力的覆盖能力信息,所述M个通信覆盖区域根据所述M个通信覆盖区域的覆盖能力按等级划分,其中M大于1。
上述说明了包括多个通信覆盖区域的可能情况,由于在不同的区域所对应的覆盖能力不同,按照不同的覆盖能力的等级划分得到不同的通信覆盖区域,便于确定能力边界。另外,使用多个根据能力等级划分的通信覆盖区域,可以使得覆盖信息的结构更清晰,便于管理和使用。
在又一种可能的实施方式中,所述覆盖信息包括用于指示所述N个感知覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述N个感知覆盖区域内的覆盖能力的覆盖能力信息,所述N个感知覆盖区域根据所述N个感知覆盖区域的覆盖能力按等级划分,其中N大于1。
上述说明了包括多个感知覆盖区域的可能情况,可以参考前述对于通信覆盖区域的描述。
在又一种可能的实施方式中,所述N个感知覆盖区域中的至少一个感知覆盖区域对应感知设备组,体现在所述N个感知覆盖区域包括多设备感知覆盖区域,所述多设备感知覆盖区域和所述路侧设备在所述多设备感知覆盖区域内的覆盖能力是根据与所述路侧设备相关的多个感知设备的覆盖能力确定的。
上述说明了对于感知覆盖区域的可能设计。与所述路侧设备相关的多个感知设备,可以 是路侧设备内包含的多个感知设备,或者是向所述路侧设备发送感知信息的多个感知设备相关联。路侧设备的感知覆盖区域可以对应独立的一个感知设备,也可以对应感知设备组。其中,感知设备组内包含一个或者多个与所述路侧设备相关的感知设备。
例如,感知设备组可以包含激光雷达和相机,可以对该激光雷达和相机感知到的信息进行融合,得到关于融合后的感知能力的感知覆盖区域和对应该感知覆盖区域的覆盖能力。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备组;所述覆盖信息还包括:感知设备组的标识。
可以看出,感知覆盖区域对应感知设备组的情况下,覆盖信息中可以包括感知设备组的标识,从而使得感知覆盖信息结构更加清晰,便于使用和管理。
在又一种可能的实施方式中,所述N个感知覆盖区域分别对应N个感知设备;所述覆盖信息还包括:感知设备的标识。
在又一种可能的实施方式中,所述N个感知覆盖区域中,一部分感知区域对应感知设备组,另一部分对应感知设备;所述覆盖信息还包括:感知设备组的标识和感知设备的标识。
例如,感知设备组的覆盖能力可以是根据多个感知设备的感知能力融合得到的。将融合后的覆盖能力,按照等级划分区域,从而得到对应该感知设备组的覆盖区域。
再如,感知设备组的覆盖区域可以是根据多个子覆盖区域的并集部分得到的,每一个子覆盖区域可以对应感知设备组中的一个感知设备。
在又一种可能的实施方式中,所述路侧设备与第一感知设备和第二感知设备相关,所述N个感知覆盖区域包括所述第一感知设备的第一覆盖区域和所述第二感知设备的第二覆盖区域,所述覆盖能力信息包括用于指示所述第一感知设备在所述第一覆盖区域内的覆盖能力的第一覆盖能力信息和用于指示所述第二感知设备在所述第二覆盖区域内的覆盖能力的第二覆盖能力信息。
在又一种可能的实施方式中,所述覆盖信息还包括盲区的信息,所述盲区包括通信盲区,感知盲区,或通信盲区和感知盲区。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在通信覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
数据正确率、丢包率、通信时延、通信稳定性和信号强度。
本申请实施例中示例性地给出了几种覆盖能力信息指示的内容(或者说指标),通过覆盖能力信息来指示上述的一种或者多种内容,可以提高覆盖信息的设计合理性,从而便于后续使用。
例如,车辆行驶过程中可以随时与路侧设备进行通信,根据路侧设备的通信稳定性可以指示通信情况,便于及时规划、调整与路侧设备之间的通信需求。
在又一种可能的实施方式中,所述覆盖能力为所述路侧设备在感知覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
感知结果正确率、误检率、漏检率、召回率、感知精度、检测稳定性和检测位置精度。
例如,自动驾驶的策略离不开感知结果,此时根据感知能力指示的感知结果正确率、召回率来确定感知结果的置信度,可以提高自动驾驶的策略的可靠性。
在又一种可能的实施方式中,所述覆盖能力信息指示多种环境下的所述覆盖能力。
例如,晴天、雨天、雾霾天气等不同的环境下,覆盖能力对应的覆盖区域可以是不同的。再如,白天、夜晚等不同的时刻,不同的温度、湿度、亮度条件下,覆盖能力对应的区域可以不同。通过多种环境下的多种能力对应的覆盖区域,在后续使用覆盖信息时,可以合理地 考虑场景因素,提高覆盖信息的准确性。
在又一种可能的实施方式中,所述覆盖区域为道路的区间段或车道的区间段,更便于基于覆盖信息辅助驾驶。
第八方面,本申请实施例提供一种计算机可读存储介质,该计算机可读存储介质存储有上述第七方面或上述第七方面任一种实现方式中的地图。
附图说明
下面将对本申请实施例中所需要使用的附图作介绍。
图1是本申请实施例适用的一种应用场景示意图;
图2是本申请实施例提供的一种感知覆盖范围的示意图;
图3是本申请实施例提供的一种通信覆盖范围的示意图;
图4是本申请实施例提供的一种数据处理方法的流程示意图;
图5A是本申请实施例提供的一种指示覆盖区域的方法示意图;
图5B是本申请实施例提供的又一种指示覆盖区域的方法示意图;
图6是本申请实施例提供的又一种覆盖区域的指示方法的示意图;
图7是本申请实施例提供的一种数据处理方法的使用场景示意图;
图8A是本申请实施例提供的又一种场景示意图;
图8B是本申请实施例提供的一种覆盖区域的示意图;
图9是本申请实施例提供的一种盲区的示意图;
图10A是本申请实施例提供的一种地图图层的示意图;
图10B是本申请实施例提供的一种地图的示意图;
图11是本申请实施例提供的一种覆盖信息的数据结构示意图;
图12是本申请实施例提供的又一种覆盖信息的数据结构示意图;
图13是本申请实施例提供的又一种数据处理方法的流程示意图;
图14是本申请实施例提供的一种数据处理装置的结构示意图;
图15是本申请实施例提供的又一种数据处理装置的结构示意图;
图16是本申请实施例提供的再一种数据处理装置的结构示意图;
图17示出本申请实施例提供的感知能力信息生成方法的流程图;
图18A示出本申请实施例提供的通信系统的结构示意图;
图18B示出本申请实施例提供的通信系统的结构示意图;
图18C示出本申请实施例提供的通信系统的结构示意图;
图19A示出本申请实施例中第一组位置点及对应轨迹的示意图;
图19B示出本申请实施例中第二组位置点以及对应轨迹的示意图;
图19C示出本申请实施例中匹配结果的示意图;
图19D示出本申请实施例中轨迹匹配的示意图;
图20A示出本申请实施例中待划分区域的示例性示意图;
图20B示出本申请实施例中网格的示例性示意图;
图20C示出了本申请实施例中网格的合并结果图;
图21示出本申请实施例的感知盲区的示例性示意图;
图22示出本申请实施例提供的通信能力信息生成方法的流程图;
图23示出本申请实施例提供的通信系统的结构示意图;
图24示出了第一分布情况的示例性示意图;
图25示出本申请实施例提供的通信系统的结构示意图;
图26示出终端设备的分布情况示意图;
图27示出本申请实施例中网格的示例性示意图;
图28示出本申请实施例中网格的合并结果的示例性示意图;
图29示出本申请实施例中网格的示例性示意图;
图30示出本申请实施例中网格的合并结果的示例性示意图;
图31示出本申请实施例的通信盲区的示例性示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例进行描述。
请参见图1,图1是本申请实施例适用的一种应用场景示意图,具体来说也是一种通信系统的示意图,包括路侧设备、车辆和服务端。其中:
(1)路侧设备,可以设置于路边(或者说路口、路侧);路侧设备可以与服务端和/或车辆通信,实现多种功能服务,例如,路侧设备采集周边信息提供给服务端和/或车辆,路侧设备为车辆提供例如车辆身份识别,电子收费,电子扣分等中的一项或多项服务。路侧设备中可以包含感知设备(或称为传感器)和/或通信设备。
其中,路侧设备的感知设备可以对周边信息(例如,道路信息)进行采集,进而提供车路协同服务。可选的,感知设备可以包含毫米波雷达、激光雷达或视觉传感器(例如摄影机等)等中的一项或者多项。
路侧设备具有一定的覆盖范围,该覆盖范围表征了路侧设备能够提供的服务的区域。进一步地,覆盖范围可以包括感知覆盖范围、通信覆盖范围等范围中的至少一种。
示例性地,以路侧设备包括激光雷达为例,路侧设备可以通过激光雷达进行目标探测,该路侧设备的激光雷达的视野范围可以看作是该路侧设备的一种感知覆盖范围。例如,请参见图2,图2是本申请实施例提供的一种可能的感知覆盖范围的示意图,感知设备201可以看作是路侧设备或者是路侧设备中一个模块(或者感知设备201可以与路侧设备相连)。感知设备201的覆盖范围如图所示,其中,覆盖范围内的不同区域对应不同的覆盖能力。如图2所示,以正确率来描述覆盖能力为例,覆盖范围内的不同的区域对应了不同的感知结果的正确率。一般来说,随着与感知设备201之间的距离越来越远,感知结果的正确率越来越低,也即是说,覆盖能力逐渐减弱。
路侧设备的通信设备可以支持路侧设备与其他设备(例如车辆、云端、或者其他路侧设备)进行通信。通信设备可以接收外部发送的数据和/或向外部发送数据。例如,通信设备可以包含以太网电缆等有线链路相关的模块或接口,或者可以包含无线链路(Wi-Fi、蓝牙、通用无线传输、车载短距通信技术等)等技术相关的模块或接口,或者既包括有线链路相关的模块或接口还包括无线链路技术相关的模块或接口。当路侧设备包括通信设备时(例如当路侧设备包括路侧单元(road side unit,RSU)或者为路侧单元时),路侧设备通过通信设备能与周围的车辆、其它路侧设备、或者服务器、或终端等设备通信,终端可以为电子设备,例如为手机、便携电脑,智能穿戴设备等。能够与路侧设备中的通信设备进行通信的区域,可以看作是该路侧设备的通信覆盖范围。
在一种可能设计中,以上路侧设备的通信设备可以包括射频部分和基带部分,射频部分包含天线和射频电路,路侧设备通过天线发射的无线信号能够到达的区域可以看作是路侧设 备的通信覆盖范围。或者,路侧设备通过天线能够接收到信号的区域可以看作是路侧设备的通信覆盖范围。
示例性地,请参见图3,图3是本申请实施例提供的一种可能的通信覆盖范围的示意图,通信设备301可以看作是路侧设备或者是路侧设备中一个模块(或者通信设备301可以与路侧设备相连)。通信设备301的覆盖范围如图所示,其中,覆盖范围内的不同区域对应的不同的覆盖能力。以正确率来描述覆盖能力为例,不同的区域对应不同的通信结果的正确率。一般来说,随着与通信设备301之间的距离越来越远,通信过程中数据传输的结果的正确率越来越低。也即是说,通信能力逐渐减弱。
本申请实施例中,将路侧设备的覆盖信息提供给车辆,可以使得车辆对路侧设备提供的环境信息依据覆盖信息进行选择或者处理,提高车辆使用的环境信息的准确性,提升车辆的行驶安全性。
需要说明的是,路侧设备可以是一个独立的设备,也可以集成在其他设备中。示例性地,路侧装置可以集成在智能加油站、充电桩、智能信号灯、路灯、电线杆或者交通指示牌等设备中。另外,前述的路边(或者说路口、路侧)可以是室外的道路,包括各类主干道、辅路、高架或者临时道路等等道路,还可以是室内的道路,例如室内停车场中的道路。
以上图2和图3中的覆盖范围用于举例,并非用于限制覆盖范围的形式,不同的感知设备可以具有不同形式(例如形状,范围)的覆盖范围,不同的通信设备可以具有不同形式(例如形状,范围)的覆盖范围。
(2)本申请实施例中涉及的车辆是通过动力驱动进行运动的装置,通常包括各种子系统,例如但是不限于行进系统、传感器系统、控制系统、一个或多个外围设备、电源以及用户接口等等。可选地,车辆还可以包括更多或更少的子系统,并且每个子系统可包括多个元件。另外,车辆的每个子系统和元件可以通过有线或者无线互连。
需要说明的是,本申请实施例中的车辆可以是汽车、电动车,也可以是轨道运行的车辆,还可以是智能车辆(例如无人驾驶车辆)、智能移动机器人等等。其中,智能车辆支持通过车载传感系统感知道路环境,自动规划行车路线并控制车辆到达预定目标位置。智能汽车集中运用了计算机、现代传感、信息融合、通讯、人工智能机或自动控制等技术,是一个集环境感知、规划决策、多等级辅助驾驶等功能于一体的高新技术综合体。示例性地,智能车辆具体可以为拥有辅助驾驶系统或者全自动驾驶系统的汽车、轮式移动机器人等。
(3)服务端可以通过服务器、移动终端、主机、虚拟机或机器人等装置实现。当服务端为服务器时,可以包含一个服务器,也可以包含多个服务器组成的服务器集群。在一些场景中,服务端还可以为云端,云端可以包括云端服务器和/或云端虚拟机。可选的,云端可以部署在公有云、私有云或者混合云上。
上述三者中任意两者之间的通信链路可以包括一种或者多种类型的连接介质,包括有线链路(例如光纤)、无线链路或者有线链路和无线链路的组合等。例如连接介质可以为无线链路,该无线链路采用近距离连接技术,例如802.11b/g技术、蓝牙(Blue Tooth)技术、紫蜂(Zigbee)技术、无线射频识别(Radio Frequency Identification,RFID)技术、超宽带(Ultra Wideband,UWB)技术、无线短距通信(例如车载无线短距通信)技术或车联网(vehicle to everything,V2X,车对外界的信息交换)技术等。再如,该无线链路采用远距离连接技术,例如全球移动通信系统(Global System for Mobile communications,GSM)、通用分组无线业务(General Packet Radio Service,GPRS)、通用移动通信系统(Universal Mobile Telecommunications System,UMTS),LTE,或5G等无线接入类型技术。
一种设计中,服务端可以与车辆进行通信,以为车辆提供多种服务,例如高精地图服务、自动驾驶或辅助驾驶服务等。又一种可能的设计中,车辆可以与服务端进行交互,使用云端提供的多种服务,例如可以通过高精地图服务提升自动驾驶或辅助驾驶功能,从而提升车辆的行驶安全性和出行效率。
又一种可能的设计中,车辆可以从服务端下载高精地图数据来获得高精地图,为使用者提供更加准确的导航服务。通过云端提供的高精地图服务,可以让车辆在行驶时实时地获取高精地图,提升车辆的自动驾驶决策的安全性。由于环境的变化是动态的,因此,高精地图除静态图层外,越来越需要更多的动态信息,以满足交通领域的发展需求。
而路侧设备是高精地图的动态信息的重要来源。当前,路侧设备可以用于提供行车道路上的环境信息,如红绿灯信息、障碍物信息等。但是,路侧设备的覆盖范围有限,对于不同的覆盖范围内,路侧设备所提供的服务(例如感知结果、通信结果等)存在一定的可靠性评估。通过路侧设备的覆盖范围进行设计,将路侧设备的覆盖范围提供给云端作为高精地图的信息,可以提升高精地图的服务质量。或者,将覆盖范围提供给车辆用于确定驾驶策略,可以提升车辆驾驶决策的可靠性。
请参阅图4,图4是本申请实施例提供的一种数据处理方法的流程示意图。可选的,图4所示的数据处理方法可适用于上述图1中所示的场景。该数据处理方法至少可以包括步骤S401和步骤S402,具体如下:
步骤S401:获取路侧设备的覆盖信息。该覆盖信息包括覆盖区域信息和覆盖能力信息,其中覆盖区域信息用于指示路侧设备的至少一个覆盖区域,覆盖能力信息用于指示路侧设备在该至少一个覆盖区域内的覆盖能力。
具体地,该步骤由数据处理装置执行,数据处理装置可以位于服务端、路侧设备或车辆。
路侧设备可以设置于路边(或者说路口、路侧)。路侧设备具有一定的覆盖范围,该覆盖范围表征了路侧设备能够提供的服务的区域。进一步地,覆盖范围可以包括感知覆盖范围、通信覆盖范围等范围中的至少一种。关于路侧装置、覆盖范围、通信覆盖范围、感知覆盖范围等描述可以参考前述对图1的相关说明,此处不再赘述。
进一步的,覆盖区域可以分为感知覆盖区域、通信覆盖区域等不同的覆盖区域类型。覆盖区域可以通过几何形状指示,也可以通过坐标指示,还可以通过相对位置来指示。下面例举三种可能的设计:
设计1:将路侧设备的位置作为参考位置,通过覆盖区域的端点关于路侧设备的相对位置,可以指示覆盖区域。以通过正确率描述覆盖能力为例,请参见图5A,图5A是本申请实施例提供的一种可能的指示覆盖区域的方法示意图,将路侧设备(或者路侧设备关联的感知设备)501的位置作为参考点,通过点A、点B、点C、点D关于路侧设备501的相对位置,可以指示正确率≥90%的覆盖区域。类似地,通过路侧设备501的位置以及点A、点B、点E、点F关于路侧设备501的相对位置,可以确定正确率≥75%的覆盖区域。其他覆盖区域可以以此类推。
应理解,多个覆盖区域还可以使用不重叠的方式描述。如图5A所示,90%≥正确率≥75%的覆盖区域可以通过路侧设备501的位置以及点C、点D、点E、点F关于路侧设备501的相对位置确定。
设计2:通过覆盖区域的端点相对于参考点O的位置,可以指示覆盖区域。请参见图5B,图5B是本申请实施例提供的又一种可能的指示覆盖区域的方法示意图,通过点I、点J、点 K、点L相对于参考点O的位置,可以确定正确率≥90%的覆盖区域。类似地,通过点I、点J、点M、点N相对于参考点O的位置,可以确定正确率≥75%的覆盖区域。其他覆盖区域可以以此类推。
类似的,多个覆盖区域还可以使用不重叠的方式描述。如图5B所示,90%≥正确率≥75%的覆盖区域可以通过点K、点L、点M、点N关于参考点O的位置确定。
设计3:通过几何形状来指示覆盖区域。请参见图3,以通信覆盖区域的形状为圆形为例,通过通信设备301(或通信设备所在的路侧设备)的位置以及半径长度,可以指示覆盖区域。如,在半径15米(m)内正确率为98%,则覆盖区域为圆心为通信设备301(或者预先配置的圆心点)、半径为15m的圆形。该通信覆盖区域的形状不做限制,例如还可以为扇形区域。
可选的,还可以使用经度、纬度等等方式来描述端点,进而指示覆盖区域。或者可选的,对于立体的覆盖区域,还可以通过六自由度位姿等等来指示覆盖区域,此处不再一一说明。
又一种可能的设计中,覆盖区域还可以经过裁剪、拼接、取交集、取并集等等加工处理。例如,覆盖区域可以包括道路的区间段或车道的区间段。示例性地,请参见图6,图6是本申请实施例提供的又一种可能的覆盖区域的指示方法的示意图。可以看出,路侧设备601的覆盖区域为路侧设备601的实际覆盖区域与道路的区间段的交集部分。应理解,对于覆盖区域包含车道的区间段的情况,同样适用。类似的,一些实施情况中,覆盖区域也可以把距离道路边缘线在某一范围内的区域包含在内。或者一些实施情况中,覆盖区域还可以包含人行道、辅道等等区间段。
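下面给出一个示意性的代码草图(仅为便于理解的示例,并非本申请的具体实现;假设使用 shapely 库进行多边形运算,坐标数值均为虚构),演示将路侧设备的实际覆盖区域与道路的区间段取交集得到覆盖区域的处理方式:

```python
# 示意性草图:用 shapely 计算"设备实际覆盖多边形 ∩ 道路区间段"作为覆盖区域
# 坐标均为虚构的平面坐标(单位:米),仅用于说明取交集这一加工处理
from shapely.geometry import Polygon

# 路侧设备的实际覆盖范围(以一个凸四边形近似)
device_coverage = Polygon([(0, 0), (40, -15), (60, 0), (40, 15)])

# 道路的一个区间段(以矩形近似)
road_segment = Polygon([(10, -5), (80, -5), (80, 5), (10, 5)])

# 覆盖区域 = 实际覆盖范围与道路区间段的交集
coverage_area = device_coverage.intersection(road_segment)

print("覆盖区域面积(平方米):", round(coverage_area.area, 2))
print("覆盖区域端点:", list(coverage_area.exterior.coords))
```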
前述的覆盖能力具体为路侧设备在覆盖区域内的覆盖能力,覆盖能力可以通过覆盖能力信息来描述。其中,感知覆盖能力为路侧设备在感知覆盖区域内的覆盖能力,通信覆盖能力为路侧设备在通信覆盖区域内的覆盖能力。
覆盖能力信息具体可以指示不同的指标,或者称为内容。在一种可能的设计,覆盖能力为路侧设备在通信覆盖区域内的覆盖能力,覆盖能力信息可以用于指示以下内容(或者说指标)中的至少一项:数据正确率、丢包率、通信时延、通信稳定性或信号强度等等中的至少一项。一种设计中,前述内容也可以称为基础指标。
在又一种可能的设计,覆盖能力为路侧设备在感知覆盖区域内的覆盖能力,覆盖能力信息用于指示以下内容(或者说指标)中的至少一项:感知结果正确率、误检率、漏检率、召回率、感知精度、感知平均精度(Average Precision,AP)、检测稳定性或检测位置精度等等中的至少一项。可选的,前述内容也可以称为基础指标。示例性地,感知结果正确率用于指示检出的正确结果占检出的结果的比率;误检率是指检出的错误结果占检出的结果的比率;漏检率是指没有被检出的结果占检出的结果的比率;召回率(Recall,也可以称为查全率)用于指正确检出的结果占所有结果(或者说所有应检出的结果)的比率;感知精度、感知平均精度可以用于评估正确率和/或召回率;检测稳定性用于指示各项检测指标随时间恒定的能力;检测位置精度用于描述感知结果的位置与真实位置之间的对应能力。
应理解,上述各项内容的说明仅为参考,在不同的应用场景中上述指标也可以有其他的解释。以召回率为例,一种可能的场景中,召回率与样本的真实结果和检测结果有关。具体的,样本的真实结果与检测结果的关系可以有以下几类:真正类(True Positive,TP)、真负类(True Negative,TN)、假正类(False Positive,FP)、假负类(False Negative,FN)。True和False用于判断结果的正确与否,Positive和Negative用于判断正类还是负类。由此可知样本总数=TP+FP+TN+FN。例如,召回率可以满足如下式子:Recall=TP/(TP+FN),准确率(Precision)可以满足如下式子:Precision=TP/(TP+FP)。
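作为一个最小的示例(TP、FP、FN 的数量均为虚构),可以按上式由统计结果计算召回率与准确率:

```python
# 最小示例:根据 TP/FP/FN 的数量计算召回率(Recall)与准确率(Precision)
def recall(tp: int, fn: int) -> float:
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0

def precision(tp: int, fp: int) -> float:
    return tp / (tp + fp) if (tp + fp) > 0 else 0.0

# 虚构的统计结果:正确检出 90 个,误检 5 个,漏检 10 个
tp, fp, fn = 90, 5, 10
print(f"Recall = {recall(tp, fn):.2f}")        # 0.90
print(f"Precision = {precision(tp, fp):.2f}")  # 0.95
```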
覆盖信息中包括的覆盖区域的数量可以是一个或者多个。相应的,覆盖能力信息也可以是一个或者多个。进一步的,由于覆盖范围的种类不同,覆盖区域也可以相应的有感知覆盖区域、通信覆盖区域等不同的覆盖区域类型。覆盖信息中可以包含一种类型的覆盖区域,也可以包含多种类型的覆盖区域,例如既包含通信覆盖区域也包含感知覆盖区域。
在一种可能的设计中,覆盖信息指示的至少一个覆盖区域可以包括M个通信覆盖区域和N个感知覆盖区域,所述M和所述N为自然数,以及所述M和所述N不同时为0。下面例举几种可能的情况:
情况1:N>0。覆盖信息中包括至少两个感知覆盖区域的覆盖区域信息。进一步的,还可以包括该至少两个感知覆盖区域对应的覆盖能力信息。示例性地,请参见表1,表1是本申请实施例提供的一种可能的覆盖信息,该覆盖信息可以用于描述如图2所示的感知覆盖范围。以正确率来描述覆盖能力为例,不同的区域对应不同的感知结果的正确率,其中,覆盖区域1内,对应的感知结果正确率≥90%。
表1覆盖信息
覆盖能力及对应的覆盖区域
{覆盖能力:感知结果正确率≥90%;覆盖区域:覆盖区域1}
{覆盖能力:感知结果正确率≥75%;覆盖区域:覆盖区域2}
{覆盖能力:感知结果正确率≥60%;覆盖区域:覆盖区域3}
……
可选的,包含多个感知覆盖区域时,覆盖能力可以通过感知能力来体现,则多个感知覆盖区域可以是按照感知能力的等级划分的。示例性地,感知能力可以分为多个等级,如第一级别、第二级别、第三级别等。其中,覆盖区域1对应第一级别,覆盖区域2对应第二级别,覆盖区域3对应第三级别。
可选的,本申请实施例中的感知能力的等级可以是根据感知能力的强弱(例如正确率的大小等)确定的,还可以是预先定义、预先配置或者根据协议规定得到的。
情况2:M>0。覆盖信息中包括至少两个通信覆盖区域的覆盖区域信息。进一步的,还可以包括该至少两个通信覆盖区域对应的覆盖能力信息。示例性地,请参见表2,表2是本申请实施例提供的一种可能的覆盖信息,该覆盖信息可以用于描述如图3所示的通信覆盖范围。以正确率来描述覆盖能力为例,不同的区域对应不同的通信结果的正确率,其中,覆盖区域1内,对应的通信的数据正确率≥98%。
表2覆盖信息
覆盖能力及对应的覆盖区域
{覆盖能力:数据正确率≥98%;覆盖区域:覆盖区域4}
{覆盖能力:数据正确率≥95%;覆盖区域:覆盖区域5}
……
可选的,包含多个通信覆盖区域时,覆盖能力可以通过通信能力来体现,则多个通信覆盖区域可以是按照通信能力的等级划分的。示例性地,通信能力可以分为多个等级,如第一级别、第二级别、第三级别等。其中,覆盖区域4对应第一级别,覆盖区域5对应第二级别。
进一步可选的,通信能力的等级也可以与其它指标相关。例如,第一级别为数据正确率≥98%且通信时延<50ms的覆盖区域;第二级别为数据正确率≥95%且通信时延<70ms的覆盖区域 等。
应理解,上述情况1和情况2可以同时存在。上述仅以覆盖能力指示正确率为例进行描述,对于覆盖能力指示了其他内容的情况本申请同样适用。
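下面用一个示意性的数据结构草图(字段名、标识与阈值均为便于说明而假设的,并非本申请规定的格式)表示同时包含感知覆盖区域与通信覆盖区域、且按能力等级划分的覆盖信息:

```python
# 示意性草图:一种可能的覆盖信息表示方式(字段名与数值均为假设,仅用于说明)
coverage_info = {
    "roadside_id": "RSU-001",           # 路侧设备标识(虚构)
    "perception_areas": [               # 感知覆盖区域,按感知能力等级划分
        {"level": 1, "metric": "感知结果正确率", "threshold": 0.90, "area_id": "覆盖区域1"},
        {"level": 2, "metric": "感知结果正确率", "threshold": 0.75, "area_id": "覆盖区域2"},
        {"level": 3, "metric": "感知结果正确率", "threshold": 0.60, "area_id": "覆盖区域3"},
    ],
    "communication_areas": [            # 通信覆盖区域,按通信能力等级划分
        {"level": 1, "metric": "数据正确率", "threshold": 0.98, "area_id": "覆盖区域4"},
        {"level": 2, "metric": "数据正确率", "threshold": 0.95, "area_id": "覆盖区域5"},
    ],
}

def areas_at_or_above(info: dict, kind: str, level: int) -> list:
    """返回指定类型中能力等级不低于 level(数值越小等级越高)的覆盖区域标识。"""
    return [a["area_id"] for a in info[kind] if a["level"] <= level]

print(areas_at_or_above(coverage_info, "perception_areas", 2))  # ['覆盖区域1', '覆盖区域2']
```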
在一种可能的设计中,覆盖信息也可以不包含覆盖能力信息。示例性地,如表3所示,覆盖信息可以通过数据编号进行索引,其中数据编号所对应的数据内容可以预先配置、通过协议定义或者通过协商得到。示例性的,根据协议规定,数据编号1001的内容对应正确率>98%的通信覆盖区域(即覆盖区域4),数据编号1002的内容对应正确率>95%的通信覆盖区域(即覆盖区域5),其他内容以此类推。通过预先规定覆盖信息的格式,可以减少冗余的覆盖能力信息,提高数据传输的效率。应理解,此处仅为示例性的使用数据编号进行索引,具体实施方式中数据编号也可以替换为数据标识、比特位置等等,此处不再赘述。
表3覆盖信息
数据编号 数据内容
…… ……
1001 {覆盖区域:覆盖区域4}
1002 {覆盖区域:覆盖区域5}
…… ……
数据处理装置获取路侧设备的覆盖信息,具体可以包含以下几种实现方式:
实现方式1:数据处理装置接收其他装置发送的覆盖信息。下面例举两种可能的示例:
示例1:数据处理装置可以接收路侧装置发送的覆盖信息。例如,请参见图7,图7是本申请实施例可能适用的场景示意图,数据处理装置可以包含在车辆702(或云端703)中,可以接收来自路侧设备701发送的覆盖信息。
示例2:数据处理装置可以接收服务器端(或者说云端)发送的覆盖信息。例如,数据处理装置可以包含在车辆702,车辆702可以从云端703(例如地图云、或辅助驾驶云端等)获取路侧设备的覆盖信息。再如,数据处理装置也可以包含在路侧设备701中,路侧设备701可以接收车辆702(或云端703)发送的覆盖信息,该覆盖信息可以包含路侧设备701的覆盖信息,可选的,还包含其他路侧设备(图7中未示出)的覆盖信息。
实现方式2:数据处理装置生成路侧设备的覆盖信息。下面分别介绍针对路侧设备感知能力和通信能力的覆盖信息。
(1)针对路侧设备的感知能力,路侧设备的覆盖区域为感知区域,用于指示路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息为感知能力信息。
图17示出本申请实施例提供的感知能力信息生成方法的流程图。如图17所示,所述方法包括:
步骤S1701,获取路侧感知结果和多源融合感知结果。
步骤S1702,将所述路侧感知结果与所述多源融合感知结果进行匹配,得到多个目标位置点的匹配结果。
步骤S1703,基于所述匹配结果,生成所述第一路侧设备的第一感知能力信息。
其中,第一路侧设备表示待确定感知能力的路侧设备。第一路侧设备可以为任意一个路侧设备。第一感知能力信息可以表示第一路侧设备的感知能力。第一感知能力信息可以用于指示第一路侧设备的感知能力,例如第一路侧设备能够感知的区域和不能感知的区域。基于路侧感知结果和多源融合感知结果的匹配结果,可以生成第一感知能力信息。
路侧感知结果可以用于指示在预设时间段内第一路侧设备感知的交通参与者的第一组位置点。其中,第一组位置点可以为第一路侧设备内的一个传感器感知到的交通参与者的位置点;也可以为对第一路侧设备内的多个传感器感知到的交通参与者的多组位置点在第一路侧设备内进行融合后得到的一组位置点。
多源融合感知结果可以用于指示在预设时间段内多个感知设备获取的所述交通参与者的多组位置点进行融合后得到的第二组位置点。其中,多个感知设备可以为相同的感知设备,也可以为不同的感知设备;且可以位于不同的载体上,且不同的载体类型可以相同或不同,例如位于路侧设备、车辆或者便携终端(也可以称为移动终端)中的至少一种,即可以位于多个路侧设备,或位于多个车辆,或位于多个便携终端,或位于路侧设备、车辆和便携终端这三种设备中的两种设备或者三种设备。
预设时间段可以表示任意一个时间段,例如,预设时间段可以以月、周、天或小时为单位,例如,预设时间段为1个月、1周或者1天等,预设时间段可以根据需要进行设置,对此本申请不做限制。可以理解的是,预设时间段的时长较长,使用的交通参与者的位置点较多时,得到的第一感知能力信息的准确性较高。
路侧感知结果和多源融合感知结果是对同一时间段内、同一路侧设备周边的交通参与者的感知结果。其中,路侧感知结果反映的是第一路侧设备在预设时间段内实际感知到的交通参与者。而多源融合感知结果使用的数据来自多个感知设备,反映的是多个感知设备在预设时间段内实际感知到的交通参与者。由于这些感知设备之间互相弥补了视角和缺点,因此,多源融合感知结果的置信度较高,可以作为路侧感知结果的参照标准,以确定路侧感知结果是否准确,从而确定第一路侧设备的感知能力。可以理解的是,第一路侧设备较好的感知到了多源融合感知结果指示的交通参与者,表明这些交通参与者处于第一路侧设备的感知范围内,第一路侧设备没有感知到多源融合感知结果指示的交通参与者,表明这些交通参与者已经超出了第一路侧设备的感知范围。举例来说,路边绿化带有行人横跨经过,该行人没有通过移动终端上报自身的位置信息,且因为被绿植部分遮挡,一些角度的车辆未识别到,但其他角度的车辆识别到了,因此多源融合感知结果中存在这个行人。若路侧感知结果中也存在这个行人,则表明这个行人在第一路侧设备的感知范围内,若路侧感知结果中不存在这个行人,则表明这个行人不在第一路侧设备的感知范围内。因此,通过将第一路侧设备的路侧感知结果和多源融合感知结果进行匹配,可以便捷、准确地确定第一路侧设备的感知范围。
下面对获取路侧感知结果和多源融合感知结果的过程进行说明。考虑到上述方法可以由云端服务器执行,也可以由第一路侧设备执行,这里分别结合图18A至图18C所示的系统结构示意图对获取路侧感知结果和多源融合感知结果的过程进行说明。
图18A示出本申请实施例提供的通信系统的结构示意图。如图18A所示,该通信系统包括云端服务器11、第一路侧设备12、车端设备13、移动终端14和第二路侧设备15。第一路侧设备12可以表示任意一个路侧设备。第二路侧设备15可以表示除第一路侧设备12以外的与云端服务器11建立通信连接的路侧设备。第二路侧设备15可以有一个或多个。第二路侧设备15可以与第一路侧设备12建立了通信连接,也可以不与第一路侧设备12建立通信连接。为了便于理解,在本申请实施例中,将第二路侧设备15中与第一路侧设备12建立了通信连接的路侧设备称为第三路侧设备。
如图18A所示,第一路侧设备12、第二路侧设备15、车端设备13和移动终端14分别与云端服务器11建立了通信连接。车端设备13与移动终端14还分别与第一路侧设备12建立了通信连接。在一个示例中,第一路侧设备12、第二路侧设备15、车端设备13和移动终 端14分别与云端服务器11可以分别通过蜂窝网(例如,3G、4G或者5G等)与云端服务器11建立通信连接。移动终端14与第一路侧设备12之间同样可以通过蜂窝网建立通信连接。车端设备13与第一路侧设备12之间可以通过专用短距离通信(dedicated Short Range Communication,DSRC)技术等车联网(Vehicle to X,V2X)技术建立通信连接。具体的,车端设备13与第一路侧设备12可以通过车载单元(On Board Unit,OBU)以及路侧设备的通信设备建立通信连接。第一路侧设备12与第二路侧设备15之间同样可以通过V2X技术建立通信连接。
如图18A所示,移动终端14可以通过终端定位装置获取终端位置数据,之后可以通过V2X网络将终端位置数据上报至第一路侧设备12,以及通过蜂窝网将终端位置数据上报至云端服务器11。车端设备13可以通过车辆定位装置获取车辆位置数据,并通过车辆传感装置获取车辆感知数据。之后,车端设备13可以通过V2X网络将车辆位置数据和车辆感知数据上报至第一路侧设备12,以及通过蜂窝网将车辆位置数据和车辆感知数据上报至云端服务器11。第一路侧设备12可以通过路侧传感装置获取路侧感知数据,通过移动终端14获取终端位置数据,以及通过车端设备13获取车辆位置数据以及车辆感知数据。其中,终端位置数据、车辆位置数据以及车辆感知数据可以称为第一路侧设备12的路侧收集数据。可选的,若第二路侧设备15中存在与第一路侧设备12建立了通信连接的第三路侧设备,第三路侧设备可以将其收集的路侧收集数据发送至第一路侧设备12,此时,第一路侧设备12的路侧收集数据中还包括了第三路侧设备的路侧收集数据。这样,在第三路侧设备与云端服务器11之间的通信连接出现故障的情况下,第三路侧设备的路侧收集数据仍然能够上报至云端服务器,从而提高了通信系统的可靠性。之后,第一路侧设备12可以通过蜂窝网将路侧感知数据和路侧收集数据上报至云端服务器。同理,第二路侧设备15也可以通过蜂窝网将路侧感知数据和路侧收集数据上报至云端服务器。第二路侧设备15获取路侧感知数据和路侧收集数据的方式可以参照第一路侧设备12获取路侧感知数据和路侧收集数据的方式,这里不再赘述。
可见,云端服务器11接收到的数据包括:来自第一路侧设备12的路侧感知数据,来自第一路侧设备12的路侧收集数据、来自第二路侧设备15的路侧感知数据,来自第二路侧设备15的路侧收集数据、来自车端设备13的车辆位置数据和车辆感知数据,以及来自移动终端14的终端位置数据。
之后,云端服务器11可以根据来自第一路侧设备12的路侧感知数据获取路侧感知结果,根据上述接收到的数据获取第一路侧设备对应的多源融合感知结果。在一个示例中,云端服务器11可以从来自第一路侧设备12的路侧感知数据中筛选出预设时间段内的路侧感知数据,得到第一路侧设备的路侧感知结果;从接收到的数据中筛选出预设时间段内且处于预选范围内的数据,并将筛选出的数据进行融合,得到第一路侧设备的多源融合感知结果。其中,预选范围为第一路侧设备周边的区域,预选范围可以根据第一路侧设备的感知范围出厂指标以及第一路侧设备的安装方向确定,例如可以根据第一路侧设备的感知范围出厂指标的基础上,在安装方向上预留一定裕度(例如,扩大3米、5米等),得到预选范围。筛选出预设时间段内且处于预选范围内的数据进行融合,可以减少进行融合以及匹配的数据量,从而降低运算量,提升效率。可以理解的是,在获取多源融合感知结果的过程中,涉及侧感知设备越多,涉及的交通参与者越多,或者预设时间段的时长越长,得到的多源融合感知结果就越准确。
在获取了路侧感知结果和多源融合感知结果之后,云端服务器11可以将路侧感知结果与多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息。之后,如图18A所示,云端服务器11可以将第一感知能力 信息下发至第一路侧设备12、车端设备13、移动终端14和第二路侧设备15等。而第一路侧设备12接收到第一感知能力信息后,又可以将第一感知能力信息转发至车端设备13、移动终端14和第二路侧设备15中的第三路侧设备。对于将路侧感知结果与多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息的过程会在本申请实施例后续部分进行详细说明。
图18B示出本申请实施例提供的通信系统的结构示意图。图18B所示的通信系统包括的设备以及设备之间的连接关系可以参照图18A所示的通信系统,这里不再赘述。图18B中,云端服务器11接收数据的过程可以参照图18A中的云端服务器11接收数据的过程,这里不再赘述。
在图18B中,云端服务器11接收的数据包括:来自第一路侧设备12的路侧感知数据,来自第一路侧设备12的路侧收集数据、来自第二路侧设备15的路侧感知数据,来自第二路侧设备15的路侧收集数据、来自车端设备13的车辆位置数据和车辆感知数据,以及来自移动终端14的终端位置数据。云端服务器11可以根据上述接收到的数据获取第一路侧设备对应的多源融合感知结果。之后,云端服务器11可以将第一路侧设备对应的多源融合感知结果发送至第一路侧设备12。第一路侧设备12可以根据自身的路侧感知数据获取路侧感知结果。
在获取了路侧感知结果和多源融合感知结果之后,第一路侧设备12可以将路侧感知结果和多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息。之后,如图18B所示,第一路侧设备12可以将第一感知能力信息发送至车端设备13、移动终端14和第二路侧设备15中的第三路侧设备。对于将路侧感知结果与多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息的过程会在本申请实施例后续部分进行详细说明。
图18C示出本申请实施例提供的通信系统的结构示意图。如图18C所示,该通信系统可以包括第一路侧设备12、车端设备13、移动终端14和第三路侧设备16。车端设备13、移动终端14和第三路侧设备16分别与第一路侧设备12建立了通信连接。
如图18C所示,车端设备13向第一路侧设备12上报了车辆位置数据和车辆感知数据,移动终端14向第一路侧设备12上报了终端位置数据,第三路侧设备16向第一路侧设备12发送了第三路侧设备的路侧感知数据和路侧收集数据。至此,第一路侧设备12中获取的数据包括:来自车端设备13的车辆位置数据和车辆感知数据,来自移动终端14的终端位置数据,来自第三路侧设备16的路侧感知数据和路侧收集数据,以及自身的路侧感知数据。之后,第一路侧设备12可以根据自身的路侧感知数据获取路侧感知结果,根据上述获取的数据获取多源融合感知结果。其中,第一路侧设备12获取路侧感知结果的方式以及获取多源融合感知结果的方式可以参照图18A中的云端服务器11获取路侧感知结果的方式以及获取多源融合感知结果的方式,这里不再赘述。
在获取了路侧感知结果和多源融合感知结果之后,第一路侧设备12可以将路侧感知结果和多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息。之后,如图18C所示,第一路侧设备12可以将第一感知能力信息发送至车端设备13、移动终端14和第三路侧设备16。对于将路侧感知结果与多源融合感知结果进行匹配,得到多个目标位置点的匹配结果,并基于匹配结果,生成第一路侧设备的第一感知能力信息的过程会在本申请实施例后续部分进行详细说明。
第一路侧设备在预设时间段内可以感知到一个或多个交通参与者,每个感知到的交通参与者对应一组位置点,称为第一组位置点。也就是说,路侧感知结果可以指示在预设时间段 内第一路侧设备感知的一个或多个交通参与者中每个交通参与者的第一组位置点。具体的,路侧感知结果可以包括其所指示的第一组位置点中每个位置点的时间信息、位置信息、运动参数和属性信息中的至少一项。
在预设时间段内,同一个交通参与者的位置变化信息可能被多个感知设备获取到。例如,在预设时间段内,车辆1的位置变化信息可以被自身的车端设备获取到、被周边的路侧设备感知到,以及被周边的其他车辆的车端设备感知到。针对一个交通参与者:在预设时间段内每个获取到该交通参与者的位置变化信息的感知设备可以获取到该交通参与者的一组位置点;所有感知到该交通参与者的位置变化信息的感知设备获取到的各组位置点进行融合后,可以得到该交通参与者对应的一组位置点,称为第二组位置点。举例来说,在本申请实施例中可以采用卡尔曼滤波、多贝叶斯估计法、模糊逻辑推理或者人工神经网络等对多个感知设备获取的数据进行融合。
可见,一个交通参与者的第一组位置点是第一路侧设备感知到的一组位置点,一个交通参与者的第二组位置点是多个感知设备获取的多组位置点融合得到的一组位置点。
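下面是一个高度简化的融合草图(仅为示例,实际系统可以采用前述卡尔曼滤波等方法;数据均为虚构),按时间戳对齐后对多个感知设备给出的同一交通参与者位置取平均,得到一组融合后的位置点:

```python
# 高度简化的多源融合草图:按时间戳对齐后取平均(实际可用卡尔曼滤波等方法,此处仅示意)
from collections import defaultdict

# 虚构数据:每个感知设备给出 (时间戳, x, y) 形式的位置点
source_a = [(0, 10.0, 1.0), (1, 12.1, 1.1), (2, 14.0, 0.9)]
source_b = [(0, 10.2, 0.8), (1, 11.9, 1.0)]
source_c = [(1, 12.0, 1.2), (2, 13.8, 1.1)]

def fuse(*sources):
    buckets = defaultdict(list)
    for src in sources:
        for t, x, y in src:
            buckets[t].append((x, y))
    fused = []
    for t in sorted(buckets):
        xs, ys = zip(*buckets[t])
        fused.append((t, sum(xs) / len(xs), sum(ys) / len(ys)))
    return fused  # 即"第二组位置点"的一种简化形式

for t, x, y in fuse(source_a, source_b, source_c):
    print(f"t={t}: ({x:.2f}, {y:.2f})")
```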
在一种可能的实现方式中,路侧感知结果和多源融合感知结果指示的位置点(包括第一组位置点和第二组位置点)是离散的位置点。路侧感知结果包括第一组位置点中每个位置点的时间信息、位置信息、运动参数和属性信息中的至少一项。多源融合感知结果包括第二组位置点中每个位置点的时间信息、位置信息、运动参数和属性信息中的至少一项。将路侧感知结果与多源融合感知结果进行匹配包括:将第一组位置点和第二组位置点进行逐点匹配。这里,逐点匹配,无需时序关系,降低了获取路侧感知结果和多源融合感知结果的难度。
在一种可能的实现方式中,路侧感知结果和多源融合感知结果指示的位置点(包括第一组位置点和第二组位置点)为轨迹中的位置点。图19A示出本申请实施例中第一组位置点及对应轨迹的示意图。图19B示出本申请实施例中第二组位置点以及对应轨迹的示意图。路侧感知结果包括第一组位置点各位置点之间的时序关系,以及第一组位置点中每个位置点的时间信息、位置信息、运动参数和属性信息中的至少一项。多源融合感知结果包括第二组位置点中各位置点之间的时序关系,以及第二组位置点中每个位置点的时间信息、位置信息、运动参数和属性信息中的至少一项。将路侧感知结果与多源融合感知结果进行匹配包括:将路侧感知结果和多源融合感知结果进行轨迹匹配。举例来说,轨迹匹配的算法可以包括但不限于匈牙利算法(Hungarian Algorithm)和K均值(K-means)算法等。本申请实施例,对轨迹匹配时采用的算法不做限制。这里,轨迹匹配结合了时序关系,可以提高匹配结果的精度和置信度。
路侧感知结果与多源融合感知结果进行匹配后,可以得到多个目标位置点的匹配结果。这里,一个目标位置点为第一组位置点中的位置点或者第二组位置点中的位置点。一个目标位置点的匹配结果为真正例(True Positive,TP)、假反例(False Negative,FN)和假正例(False Positive,FP)中的一者。
一个目标位置点的匹配结果为TP表示:该目标位置点为第二组位置点中的位置点,且第一组位置点中存在与该目标位置点匹配的位置点。一个目标位置点的匹配结果为FN表示:该目标位置点为第二组位置点中的位置点,且第一组位置点中不存在与该目标位置点匹配的位置点。一个目标位置点的匹配结果为FP表示:该目标位置点为第一组位置点中的位置点,且第二组位置点中不存在与该目标位置点匹配的位置点。
图19C示出本申请实施例中匹配结果的示意图。如图19C所示,k1、k2、k3为路侧感知结果对应的轨迹,k1、k2、k3上的位置点为第一组位置点中的位置点;h1、h2和h3为多源 融合感知结果对应的轨迹,h1、h2和h3上的位置点为第二组位置点中的位置点。经过轨迹匹配发现,h1与k1相匹配,h2和k2相匹配,不存在与h3相匹配的轨迹,也不存在与k3相匹配的轨迹。对于h1和h2上的位置点而言,其属于第二组位置点,且第一组位置点中存在与其相匹配的位置点,因此,h1和h2上的位置点为目标位置点且匹配结果为TP。对于h3上的位置点而言,其属于第二组位置点且第一组位置点中不存在与其匹配的位置点,因此,h3上的位置点为目标位置点且匹配结果为FN。对于k3而言,其属于第一组位置点且第二组位置点中不存在与其匹配的位置点,因此k3上的位置点为目标位置点且匹配结果为FP。
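下面给出一个示意性草图(轨迹数据均为虚构),先按轨迹间的平均距离用匈牙利算法(此处假设借助 scipy 的 linear_sum_assignment 实现)匹配路侧轨迹与融合轨迹,再据此将位置点统计为 TP、FN、FP:

```python
# 示意性草图:用匈牙利算法按轨迹平均距离做匹配,再把位置点分类为 TP/FN/FP
# 假设使用 numpy 与 scipy;轨迹为虚构的 (x, y) 序列,长度相同以便逐点求距离
import numpy as np
from scipy.optimize import linear_sum_assignment

roadside_tracks = {            # 路侧感知结果对应的轨迹(第一组位置点)
    "k1": np.array([[0, 0], [1, 0], [2, 0]]),
    "k2": np.array([[0, 5], [1, 5], [2, 5]]),
    "k3": np.array([[9, 9], [10, 9], [11, 9]]),
}
fused_tracks = {               # 多源融合感知结果对应的轨迹(第二组位置点)
    "h1": np.array([[0, 0.1], [1, 0.1], [2, 0.1]]),
    "h2": np.array([[0, 4.9], [1, 5.0], [2, 5.1]]),
    "h3": np.array([[0, 20], [1, 20], [2, 20]]),
}

r_names, f_names = list(roadside_tracks), list(fused_tracks)
cost = np.array([[np.linalg.norm(roadside_tracks[r] - fused_tracks[f], axis=1).mean()
                  for f in f_names] for r in r_names])

rows, cols = linear_sum_assignment(cost)
MATCH_GATE = 1.0               # 平均距离门限(假设值),超过则视为未匹配
matches = {(r_names[i], f_names[j]) for i, j in zip(rows, cols) if cost[i, j] < MATCH_GATE}

matched_r = {r for r, _ in matches}
matched_f = {f for _, f in matches}
tp_points = sum(len(fused_tracks[f]) for f in matched_f)                          # 被匹配的融合轨迹上的点 -> TP
fn_points = sum(len(fused_tracks[f]) for f in f_names if f not in matched_f)      # 未被匹配的融合轨迹上的点 -> FN
fp_points = sum(len(roadside_tracks[r]) for r in r_names if r not in matched_r)   # 未被匹配的路侧轨迹上的点 -> FP
print("匹配结果:", matches, "TP:", tp_points, "FN:", fn_points, "FP:", fp_points)
```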
图19D示出本申请实施例中轨迹匹配的示意图。如图19D所示,k4、k5和k6为路侧感知结果对应的轨迹,k4、k5和k6上的位置点为第一组位置点中的位置点;h4、h5和h6为多源融合感知结果对应的轨迹,h4、h5和h6上的位置点为第二组位置点中的位置点。不同交通参与者的轨迹之间可能出现交汇的情况,如图19D所示的k4与k5出现交汇,k4与k6出现交汇。若将k4在t至t+3的部分与k5在t+3至t+7的部分误判为一条轨迹,则会将k4与k5的组合轨迹与h4相匹配,从而将h4上的位置点误判为匹配结果为TP的目标位置点。若将k4在t+5至t+7的部分与K6在t至t+5的部分误判为一条轨迹,则会将k4与k6的组合轨迹与h6向匹配,从而将h6上的位置点误判为匹配结果为TP的目标位置点。在本申请实施例中,路侧感知结果与多源融合感知结果中包括几何形状大小、颜色等属性信息,可以在不同交通参与者的轨迹交汇时,降低轨迹误判的可能性,从而提高目标位置点的准确性和置信度。
在一种可能的实现方式中,对于匹配结果为TP的目标位置点,可以与指标信息进行关联,以指示该目标位置点的状态。在一个示例中,指标信息可以包括运动指标误差、形状大小误差、目标跟踪稳定性和位置点正确匹配率中的一者或多个。其中,运动指标误差包括位置误差和/或速度误差。举例来说,位置误差可以为dx/dy,其中,dx表示目标位置点与其匹配的第一位置点在水平方向或者经度上的差值,dy表示目标位置点与其匹配的第一位置点在竖直方向上或者纬度的差值。速度误差可以为速度差值、速度比值、加速度差值和加速度比值中的一项或多项。形状大小误差可以为大小的差值或者大小的比值。目标跟踪稳定性表示了估算的位置点与采集的位置点的偏差,可以反映一组位置点的可靠性,目标跟踪稳定性较高则这组位置点的可靠性较高,目标跟踪稳定性较低则这组位置点的可靠性较低。确定目标跟踪稳定性的过程中,可以采用卡尔曼滤波、隐马尔科夫模型或者均值漂移等方法估算位置点。位置点正确匹配率表示第二组位置点中匹配结果为TP的位置点的数量与第二组位置点中位置点的总数量之比。可以理解的是,对于同一第二组位置点中的目标位置点,其关联的跟踪稳定性是相同,其关联的位置点正确匹配率也是相同的。可以理解的是,以上仅为指标信息的示例性说明,匹配结果为TP的目标位置点还可以关联其他指标信息。
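作为一个最小示例(数据均为虚构),可以为匹配结果为 TP 的目标位置点计算位置误差 dx/dy,并统计位置点正确匹配率:

```python
# 最小示例:为 TP 目标位置点计算位置误差 dx/dy,并统计位置点正确匹配率
# 数据为虚构;dx 为水平(或经度)方向差值,dy 为竖直(或纬度)方向差值
fused_points   = [(10.0, 1.0), (12.0, 1.1), (14.0, 0.9), (16.0, 1.0)]   # 第二组位置点
matched_points = [(10.2, 0.8), (11.9, 1.0), None, None]                 # 匹配到的第一组位置点(None 表示未匹配,即 FN)

errors = [(f[0] - m[0], f[1] - m[1])
          for f, m in zip(fused_points, matched_points) if m is not None]
correct_match_rate = len(errors) / len(fused_points)   # 位置点正确匹配率 = TP 数 / 第二组位置点总数

print("各 TP 点的位置误差 dx/dy:", [(round(dx, 2), round(dy, 2)) for dx, dy in errors])
print("位置点正确匹配率:", correct_match_rate)   # 2/4 = 0.5
```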
至此,获取了多个目标位置点,以及每个目标位置点的匹配结果。下面对基于匹配结果,生成第一路侧设备的第一感知能力信息的过程进行说明。
在一种可能的实现方式中,基于匹配结果,生成第一路侧设备的第一感知能力信息可以包括:基于第一路侧设备的预选范围,确定出多个网格;合并所述多个网格中网格指标满足第一条件的网格,得到合并后的网格,并继续合并存在的网格中网格指标满足所述第一条件的网格,直至不存在满足所述第一条件的网格;针对任意一个网格,将所述网格确定为一个感知区域,并基于所述网格的网格指标所属的指标范围,确定所述网格的感知能力级别;根据各网格的位置信息和感知能力级别,确定所述第一路侧设备的感知能力信息。
第一路侧设备的预选范围可以为第一路侧设备周边的区域,第一路侧设备的预选范围可 以根据第一路侧设备的感知范围出厂指标以及第一路侧设备的安装方向确定。在一个示例中,第一路侧设备的预选范围要大于第一路侧设备的感知范围出厂指标在安装方向上指示的范围。
在一种可能的实现方式中,基于第一路侧设备的预选范围,确定出多个网格可以包括:对第一路侧设备的预选范围进行网格化处理得到多个网格。
在另一种可能的实现方式中,基于第一路侧设备的预选范围,确定出多个网格可以包括:取第一路侧设备的预选范围与第一道路的交集,得到待划分区域;对待划分区域进行网格化处理,得到多个网格。其中,第一道路可以表示第一路侧设备所在的道路或者第一路侧设备所感知的道路,第一道路与第一路侧设备的关联关系可以在部署第一路侧设备时预先设置。
图20A示出本申请实施例中待划分区域的示例性示意图。如图20A所示,待划分区域未超出第一道路的道路边缘线。这样,既可以不会减少感知到的交通参与者的数量,又为后续网格划分与融合提供了便利。图20B示出本申请实施例中网格的示例性示意图。如图20B所示,待划分区域可以划分为多个网格。在一个示例中,待划分区域均匀的划分成多个网格,这样,便于统计管理。当然,还可以采用其他方式将待划分区域划分成多个网格,例如,在距离第一路侧设备较近的区域划分出的网格的面积小于距离第一路侧设备较远的区域划分出的网格的面积。
在划分完网格之后,可以确定每个网格的网格指标。在一个示例中,针对任意一个网格,可以根据处于该网格的目标位置点的指标信息,确定该网格的网格指标。在一个示例中,所述网格指标包括检出指标、运动指标和跟踪指标中的一者或多者,其中,所述检出指标包括准确率和/或召回率,所述运动指标包括速度和/或加速度,所述跟踪指标包括位置点正确匹配率和/或目标跟踪稳定性。
在确定各个网格的网格指标之后,可以将合并所述多个网格中网格指标满足第一条件的网格,得到合并后的网格。其中,所述第一条件包括以下条件中的一者或多个:检出指标的差距小于第一阈值;运动指标的差距小于第二阈值;跟踪指标的差距小于第三阈值。第一阈值、第二阈值和第三阈值可以根据需要进行设置,例如,第一阈值可以为90%等,第二阈值可以为1m/s等,第三阈值可以为95%等。本申请实施例对第一阈值、第二阈值和第三阈值不做限制。
之后,确定经过上一轮合并之后得到的各个网格的网格指标,并继续合并存在的网格中网格指标满足所述第一条件的网格,直至不存在满足所述第一条件的网格。图20C示出了本申请实施例中网格的合并结果图。如图20C所示,划分出的网格经合并得到三个区域,分别为区域1、区域2和区域3。参见图20C,区域1中匹配结果为FN的目标位置点的比例较大,匹配结果为FP的目标位置点的比例很小,匹配结果为TP的目标位置点的比例极小(甚至为0),可见第一路侧设备未能感知到区域1中存在的交通参与者,第一路侧设备在区域1中不具备感知能力;区域2中匹配结果为TP的目标位置点的比例较小,匹配结果为FN和FP的位置点的比例较大,可见,第一路侧设备能感知到区域2中的部分交通参与者,第一路侧设备在区域2中具有感知能力,但感知能力较差;区域3中匹配结果为TP的目标位置点的比例较大,匹配结果为FN和FP的目标位置点的比例很小,可见第一路侧设备在区域3中具有感知能力,且感知能力较强。
在不存在满足第一条件的网格的情况下,即网格无法继续合并的情况下,针对任意一个网格,将所述网格确定为一个感知区域,并基于所述感知区域的网格指标所属的指标范围,确定所述感知区域的感知能力级别;根据各感知区域的位置信息和感知能力级别,确定所述第一路侧设备的感知能力信息。
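下面给出前述"划分网格、按网格指标合并、确定感知能力级别"流程的一个简化一维示例(阈值、指标取值与级别划分均为假设),仅用于说明合并逻辑;实际实现中网格为二维,指标也可以包含检出、运动、跟踪等多项:

```python
# 简化的网格合并草图:相邻网格的检出指标(此处以召回率示意)差距小于第一阈值则合并
FIRST_THRESHOLD = 0.05   # 第一阈值(假设值)

grid_recall = [0.02, 0.03, 0.35, 0.38, 0.92, 0.95, 0.96]   # 各网格的召回率(虚构)

regions = [[0]]                      # 每个感知区域记录其包含的网格下标
for i in range(1, len(grid_recall)):
    prev = regions[-1][-1]
    if abs(grid_recall[i] - grid_recall[prev]) < FIRST_THRESHOLD:
        regions[-1].append(i)        # 与前一网格指标接近,合并到同一区域
    else:
        regions.append([i])          # 否则开启新的区域

def level(avg_recall: float) -> str:
    # 指标范围与感知能力级别的对应关系(假设的划分方式)
    if avg_recall >= 0.9:
        return "感知能力较强"
    if avg_recall >= 0.3:
        return "感知能力较弱"
    return "盲区"

for r in regions:
    avg = sum(grid_recall[i] for i in r) / len(r)
    print(f"网格{r} -> 平均召回率 {avg:.2f} -> {level(avg)}")
```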
在本申请实施例中,每个指标范围对应一个感知能力级别,基于感知区域的网格指标所属的指标范围,确定所述感知区域的感知能力级别包括:在所述感知区域的网格指标属于第一指标范围的情况下,确定所述感知区域的感知能力级别为第一感知能力级别。其中,第一指标范围为各指标范围中的任意一个,第一感知能力级别为第一指标范围对应的感知能力级别。以图20C为例,假设有三个感知区域,分别为:区域1、区域2和区域3,其中,区域1的网格指标属于指标范围1,区域2的网格指标属于指标范围2,区域3的指标属于指标范围3,则可以第一路侧设备在区域1的感知能力级别为级别1,在区域2的感知能力级别为级别2,在区域3的感知能力级别为级别3。
在一个示例中,感知区域的网格指标属于第一指标范围可以包括:检出指标在第一范围内,和/或运动指标在第二范围内,和/或跟踪指标在第三范围内。其中,第一范围、第二范围和第三范围可以根据需要进行设置,本申请实施例不做限制。
在一个示例中,感知能力级别可以包括:盲区、感知能力较弱、感知能力一般和感知能力较强。在又一示例中,感知能力级别可以包括:低级、中级和高级。在另一示例中,感知能力级别可以包括:第一级、第二级、第三级和第四级等。可以理解的是,以上仅为感知能力级别的示例性说明,本申请实施例对感知能力级别的划分方式和划分数量不做限制。
在一种可能的实现方式中,第一感知能力信息可以用于指示第一路侧设备的感知能力。举例来说,第一感知能力信息可以指示第一路侧设备能够感知的区域和不能感知的区域。例如第一路侧设备能够感知200米以内的区域,不能感知200米以外的区域。
在一种可能的实现方式中,第一感知能力信息可以用于指示第一区域和第一路侧设备在第一区域内的感知能力。
其中,第一区域可以表示任意一个区域。在一个示例中,第一区域可以为第一道路上的一个区域。第一区域可以为矩形、扇形或者多边形等。本申请实施例对第一区域的形状和面积不做限制。举例来说,第一路侧设备感知100米以内的区域效果较好,即感知能力为强感知;感知100米到150米的效果一般,即感知能力为中等感知;感知150米到200米的区域效果较差,即感知能力为弱感知;感知不到200米以外的区域,即感知能力为无感知。
在一种可能的实现方式中,第一感知能力信息可以用于指示第一场景、第一区域和第一路侧设备在第一场景下以及第一区域内的感知能力。
本发明实施例中的“场景”用于标识具有感知功能的设备工作所处的环境,或者标识具有感知功能的设备所感知的目标所处的环境。其中,第一场景可以表示任意一种场景。举例来说,第一场景包括但不限于白天、夜间、晴天、阴天、风沙、雨雪、雾天等影响感知能力的场景。可以理解的是,第一路侧设备在白天的感知范围要大于夜间的感知范围,晴天的感知范围要大于阴天、风沙、雨雪和雾天的感知范围。风沙大小不同、雨雪强度不同或者雾的级别不同,第一路侧设备的感知范围也不同。因此,在本申请实施例中,可以分场景描述第一路侧设备的感知能力,从而使得第一路侧设备的感知能力的准确性更高。举例来说,在晴天的场景下,第一路侧设备在图20C所示的区域2的感知能力为中等感知,在图20C所示的区域3的感知能力为强感知;在雾天场景下,第一路侧设备在图20C所示的区域2的感知能力为弱感知,在图20C所示的区域3的感知能力为中等感知。
需要说明的是,在第一感知能力信息用于指示第一场景、第一区域和第一路侧设备在第一场景下以及第一区域内的感知能力时,可以在前述路侧感知数据、车辆感知数据、车辆位置数据和终端位置数据中添加场景标签,这样可以获取到第一场景下的路侧感知结果和多源融合感知结果。若上述路侧感知数据、车辆感知数据、车辆位置数据和终端位置数据中未添 加场景标签,则在获取第一场景下的路侧感知结果和多源融合感知结果之前,可以结合第三方信息(例如结合时间信息和历史天气信息),得到第一场景下的路侧感知数据、车辆感知数据、车辆位置数据和终端位置数据。
至此,获得了第一路侧设备的第一感知能力信息。在本申请实施例中,针对任意一个第二路侧设备,该第二路侧设备的第二感知能力信息可以参照第一路侧设备的第一感知能力信息,获取该第二路侧设备的第二感知能力信息的方式,可以参照获取第一路侧设备的感知能力信息的方式,这里不再赘述。
在一种可能的实现方式中,第一路侧设备的第一感知能力信息可以与道路的标识进行关联。这样,在规划路线或者交通参与者计划进入一条路或者一段路之前,可以调出一条路或者一段路上各个路侧设备的感知能力信息,从而确定一条路或者一段路上各个区域的路侧感知效果,有利于提高安全性。
下面对感知能力信息的应用进行说明。
考虑到在道路上因遮挡等原因可能导致多个路侧设备下仍存在盲区,本申请实施例中可以综合各个路侧设备的感知能力信息形成整体的感知覆盖能力。在一种可能的实现方式中,所述方法还包括:生成多个路侧设备的多个感知能力信息;根据所述多个感知能力信息,生成感知盲区信息。
其中,多个感知能力信息用于指示所述多个路侧设备的感知能力。具体的,多个路侧设备包括第一路侧设备,则多个感知能力信息包括第一感知能力信息。另外,多个路侧设备还可以包括一个或多个第二路侧设备,则多个感知能力信息包括一个或多个第二感知能力信息。
所述感知盲区信息用于指示所述多个路侧设备中的一个或多个路侧设备未覆盖的区域。在一个示例中,所述多个路侧设备中的一个或多个路侧设备未覆盖的区域包括:绝对盲区和/或相对盲区。其中,所述多个路侧设备中每个路侧设备在所述绝对盲区内均不能达到感知能力标准,所述多个路侧设备中的部分路侧设备在所述相对盲区内不能达到所述感知能力标准。
其中,感知能力标准可以根据需要进行设置,本申请对感知能力标准不做限制。在一个示例中,达到感知能力标准包括但不限于:符合预置感知能力级别(例如,对应感知能力级别为一级或者二级),或者处于预置指标范围(例如,检出指标落在预置指标范围内,和/或运动指标落在预置指标范围内,和/或跟踪指标落在预置指标范围内)等。在一路侧设备在一区域未达到感知能力标准的情况下,表明该路侧设备在该区域的感知效果较差,在该区域感知到的信息的置信度较低,因此,该区域为该路侧设备的盲区。图21示出本申请实施例的感知盲区的示例性示意图。图21示出了路侧设备1的感知盲区与非感知盲区的分界线,以及路侧设备2的感知盲区与非感知盲区的分界线。在分界线以内的区域为非感知盲区,分界线以外的区域为感知盲区。路侧设备1的感知盲区与路侧设备2的非感知盲区的交集,以及路侧设备1的非感知盲区和路侧设备2的感知盲区的交集,为相对感知盲区。路侧设备1的感知盲区与路侧设备2的感知盲区的交集为绝对感知盲区。
以图21所示的路侧设备1和路侧设备2为例,对确定相对感知盲区和绝对感知盲区的过程进行说明。
在路侧设备1和路侧设备2之间建立了通信连接的情况下,对一个区域的感知能力,以路侧设备1和路侧设备2中最好的感知能力为准。对于一个区域,若路侧设备1的感知能力与路侧设备2的感知能力均未达到感知能力标准,则可以确定该区域为绝对感知盲区。这种情况下,可以不标记相对感知盲区。
在路侧设备1和路侧设备2之间未建立通信连接的情况下,将路侧设备1的感知能力未 达到感知能力标准但路侧设备2的感知能力能够达到感知能力标准的区域,以及路侧设备2的感知能力未达到感知能力标准的区域但路侧设备1的感知能力能够达到感知能力标准的区域,确定为相对感知盲区;将两者的感知能力均未达到感知能力标准的区域确定为绝对感知盲区。
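以两台路侧设备为例,下面的草图(各区域是否达标的数据均为虚构)演示在未建立通信连接的情况下,按上述规则把各区域标记为相对感知盲区或绝对感知盲区的判断方式:

```python
# 草图:根据两台路侧设备在各区域是否达到感知能力标准,标记相对/绝对感知盲区
# 字典中的布尔值表示"该设备在该区域是否达到感知能力标准",数据为虚构
meets_standard = {
    "区域A": {"路侧设备1": True,  "路侧设备2": True},
    "区域B": {"路侧设备1": False, "路侧设备2": True},
    "区域C": {"路侧设备1": True,  "路侧设备2": False},
    "区域D": {"路侧设备1": False, "路侧设备2": False},
}

def classify(area_result: dict) -> str:
    ok = [dev for dev, meet in area_result.items() if meet]
    if not ok:
        return "绝对感知盲区"                  # 所有设备都未达到感知能力标准
    if len(ok) < len(area_result):
        return "相对感知盲区(仅 %s 覆盖)" % "、".join(ok)
    return "非盲区"

for area, result in meets_standard.items():
    print(area, "->", classify(result))
```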
在一个示例中,可以为绝对感知盲区和相对感知盲区添加不同的标识。例如,为绝对感知盲区添加第一标识,为相对感知盲区添加第二标识。这样,根据标识即可确定一个感知盲区是绝对感知盲区还是相对感知盲区。可选的,在标识相对感知盲区时,还可以将相对感知盲区与路侧设备的标识相关联,以明确一个相对感知盲区是哪个路侧设备的感知盲区。
在又一示例中,可以将一个路侧设备的感知能力信息与该路侧设备建立了通信连接的路侧设备建立连接。这样,用户可以自行判断出路侧设备与哪些路侧设备建立了通信连接,从而确定出哪里是绝对感知盲区哪里是相对感知盲区。
在一种可能的实现方式中,所述方法还包括:根据所述第一感知能力信息生成预警提示信息。其中,预警提示信息用于提示在第二区域内由驾驶员接管车辆,对第一路侧设备进行故障检测、降低第一路侧设备感知得到的关于第二区域的信息的置信度或者在路径规划时避开第二区域。
其中,第一感知能力信息指示第一路侧设备在第二区域内的感知能力低于感知阈值。感知阈值可以根据需要进行设置。在一个示例中,低于感知阈值可以包括但不限于:未达到阈值感知能力级别(例如,未达到一级感知能力级别或者未达到二级感知能力级别等)、检出指标未达到预置的检出指标阈值、运动指标未达到预置的运动指标阈值以及跟踪指标未达到预设的跟踪指标阈值中的一者或多者。这里的检出指标阈值、运动指标阈值和跟踪指标阈值可以根据需要进行设置,本申请实施例不做限制。考虑到感知能力标准用于判定感知盲区,感知阈值用于进行预警,在非感知盲区但是感知效果较差的区域就需要进行预警,因此,在一个示例中,感知阈值可以大于(高于)或者等于感知能力标准。
由于在第二区域内第一路侧设备的感知能力低于感知阈值,代表第一路侧设备在第二区域内的感知效果较差,第一路侧设备无法较准确且全面的感知到第二区域内的交通参与者。因此,在第二区域内车辆进行自动驾驶的风险较高,驾驶员可以在第二区域内接管车辆。同时,可以对第一路侧设备进行故障检测,查看是否因为第一路侧设备发生了故障而造成了第一路侧设备在第二区域内的感知效果较差,特别是在第二区域距离第一路侧设备较近的情况下。另外,由于第一路侧设备在第二区域内的感知效果较差,第一路侧设备感知得到的关于第二区域的信息的准确性也就相对较低,可以降低第一路侧设备感知得到的关于第二区域的信息的置信度。在一个示例中,第一路侧设备感知得到的关于第二区域的信息包括:第二区域内的交通参与者的位置点以及各位置点的时间信息、位置信息、运动参数和属性信息等中的一者或多者。由于第一路侧设备在第二区域内的感知效果较差,因此在路径规划时可以避开第二区域,这样可以降低车辆进入第二区域后发生事故的可能性,特别是对于自动驾驶的车辆,避开第二区域行驶,就不需要驾驶员接管车辆,可以有效提升用户体验。
在一种可能的场景中,路侧设备可以向数据处理装置上报关于覆盖范围的参数。相应的,数据处理装置根据一个或者多个路侧设备上报的覆盖范围的参数,生成路侧设备的覆盖信息。其中,路侧设备上报的关于覆盖范围的参数可以是路侧设备中预先配置的、预先定义或预先设计的,也可以是通过实际检测得到的。可选的,关于覆盖范围的参数中或者覆盖信息中可以包含用于指示覆盖能力的来源的信息(例如:预先设计、实测、预估等等中的一项)。
可选的,路侧设备可以包含一个或者多个感知设备,或者可以与一个或多个感知设备连 接。路侧设备的感知能力具体可以通过感知设备实现。进一步可选的,感知设备可以进行组合,一个或者多个感知设备可以组成感知设备组。例如,相机和激光雷达可以作为融合感知设备组,进行图像和激光探测结合的融合感知。
进一步可选的,在路侧设备包含多个感知设备(或者与多个感知设备连接)的情况下,覆盖信息中的感知覆盖区域可以对应感知设备或者感知设备组。示例性的,当路侧设备的覆盖信息包含多个感知覆盖区域的情况下,多个感知覆盖区域中的每个感知覆盖区域可以对应一个感知设备,或者每个感知覆盖区域对应一个感知设备组,或者多个感知覆盖区域中部分感知覆盖区域对应感知设备、部分感知区域对应感知设备组。
在一种可能的设计中,对应感知设备组的感知覆盖区域和感知覆盖区域内的覆盖能力是根据感知设备组内的感知设备的覆盖能力确定的。例如,感知设备组的覆盖能力可以是根据多个感知设备的感知能力融合得到的。进一步的,将融合后的覆盖能力,按照等级划分区域,从而可以得到对应该感知设备组的覆盖区域。
对于感知设备组对应的感知覆盖区域,可以称为多设备感知覆盖区域,则多设备感知覆盖区域和路侧设备在该多设备感知覆盖区域内的覆盖能力是根据感知设备组中的多个感知设备的覆盖能力确定的。该多个感知设备与路侧设备相关,是指该多个感知设备中每个感知设备与该路侧设备相关;感知设备与路侧设备相关,是指该感知设备向该路侧设备发送该感知设备感知到的信息,物理实现上,包括但不限于该感知设备设置于该路侧设备中,或者设置于该路侧设备之外且通过无线或者有线的方式与该路侧设备相连。
(2)针对路侧设备的通信能力,路侧设备的覆盖区域为通信区域,用于指示路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息为通信能力信息。
图22示出本申请实施例提供的通信能力信息生成方法的流程图。如图22所示,所述方法可以包括:
步骤S2201,获取第一通信状态指示信息。
步骤S2202,根据所述第一通信状态指示信息,确定所述多个位置点在所述第一路侧设备周围的第一分布情况。
步骤S2203,根据所述第一分布情况,生成所述第一路侧设备的第一通信能力信息。
其中,第一路侧设备表示待确定通信能力的路侧设备。第一路侧设备可以为任意一个路侧设备。第一通信能力信息可以表示第一路侧设备的通信能力信息。第一通信能力信息可以用于指示第一路侧设备的通信能力,例如第一路侧设备能够通信的区域和第一路侧设备不能通信的区域。
第一通信状态指示信息可以用于指示多个终端设备在多个位置点与第一路侧设备建立通信连接。在一个终端设备在一个位置点与第一路侧设备建立了通信连接的情况下,表明该位置点位于第一路侧设备的通信范围内,第一路侧设备的通信能力能够达到该位置点。因此,基于与第一路侧设备建立通信连接的多个终端设备的多个位置点的分布情况,可以确定出第一路侧设备的通信能力能够达到的区域,从而便捷、准确地获得第一路侧设备的通信范围。
可以理解的是,第一通信状态指示信息指示的多个终端设备在多个位置点可以包括:不同的终端设备在同一时刻的位置点、同一个终端设备在不同时刻的位置点,以及不同的终端设备在不同时刻的位置点。例如,多个终端设备在多个位置点可以包括:车辆1在周一上午1点的位置点1和车辆2在周一上午1点的位置点2,车辆1在周一上午1点的位置点1和车辆1在周一下午1点的位置点3,以及车辆3在周二上午1点的位置点4和车辆4在周二下 午1点的位置点5。也就是说,本申请实施例对第一通信状态指示信息指示的多个位置点是否是同一个终端设备的位置点以及是否为同一时刻采集到的位置点不做限制。
在一种可能的实现方式中,第一通信状态指示信息可以包括:所指示的多个位置点的位置信息,所指示的多个终端设备中多个通信模块的工作状态信息、所指示的多个终端设备与第一路侧设备的连接状态信息、第一路侧设备的标识信息,以及时间信息。其中,位置点的位置信息和通信模块的工作状态信息以及时间信息如上文所述,这里不再赘述。
一个终端设备与一个路侧设备的连接状态信息可以为已连接状态或者未连接状态。已连接状态表明该终端设备已经与该路侧设备建立了通信连接,未连接状态表明该终端设备未与该路侧设备建立通信连接。由于第一通信状态指示信息指示的是多个终端设备在多个位置点与第一路侧设备建立通信连接,因此,第一通信状态指示信息中与第一路侧设备的连接状态信息为已连接状态。
路侧设备的标识信息可以用于识别不同的路侧设备。举例来说,路侧设备的标识信息可以为路侧设备的名称、编号、位置信息、其上配置的通信模块的标识,或者其他用户自定义的标识等。因此,第一路侧设备的标识信息可以为第一路侧设备的名称、编号、第一路侧设备的RSU_ID或者其他用户为第一路侧设备自定义的标识等。
下面对获取第一通信状态指示信息的过程进行说明。
图23示出本申请实施例提供的通信系统的结构示意图。如图23所示,该通信系统包括第一路侧设备11和第一终端设备12。其中,第一路侧设备11可以表示任意一个路侧设备,第一终端设备12表示与第一路侧设备11建立通信连接的终端设备。第一终端设备12包括且不限于车端设备和移动终端等设备。第一路侧设备11可以连接一个或多个第一终端设备12。具体的,第一路侧设备11可以通过第一终端设备12中的通信模块与第一终端设备12建立通信连接。第一终端设备12获取自身的交通参与者数据后,可以将获取的交通参与者数据上报至第一路侧设备11。
在一种可能的实现方式中,一个终端设备的交通参与者数据可以包括采集该交通参与者数据时该终端设备所在位置点的位置信息,采集该交通参与者数据的时间信息,该终端设备中通信模块的工作状态信息,以及该终端设备连接的路侧设备的标识信息。在一个示例中,位置信息可以记为Position,工作状态信息可以记为Connection,路侧设备的标识信息可以记为RSU_ID,时间信息可以记为Time,则一个终端设备的交通参与者数据可以记为(Position,Device,Connection,RSU_ID,Time)。由于第一终端设备12为与第一路侧设备11建立通信连接的终端设备,因此,第一终端设备12的交通参与者数据中,通信模块的工作状态信息为“正常工作状态”,路侧设备的标识信息中包括“第一路侧设备11的标识信息”。第一路侧设备11接收到各第一终端设备12上报的交通参与者数据后,可以基于接收到的信息,生成第一通信状态指示信息。
需要说明的是,参照图23可知,与第一路侧设备11建立通信连接的第一终端设备12可以直接将交通参与者数据上报至第一路侧设备11,其他未与第一路侧设备11建立通信连接的终端上设备是无法直接将其交通参与者数据上报至第一路侧设备11的(这里不考虑通过其他路侧设备转发的情况,即使第一路侧设备接收到了其他路侧设备转发的交通参与者数据,基于其中的路侧设备的标识信息还是可以筛选出与第一路侧设备12建立了通信连接的交通参与者数据的)。因此,第一路侧设备11收集的交通参与者数据都是来自与第一路侧设备11建立通信连接的第一终端设备12的。
在获取到第一通信状态指示信息之后,第一路侧设备可以执行步骤S2202得到第一分布 情况。在一个示例中,第一路侧设备可以根据第一位置点(即第一通信状态指示信息指示的第一终端设备的位置点)的位置信息,确定第一分布情况。
图24示出了第一分布情况的示例性示意图。如图24所示,第一终端设备(即与第一路侧设备建立通信连接的终端设备)在多个位置点与第一路侧设备建立了通信连接,这些位置点的位置信息即为第一分布情况。参见图24可知,在靠近第一路侧设备的区域,能够与第一路侧设备建立通信连接的位置点较多,在远离第一路侧设备的区域,能够与第一路侧设备建立通信连接的位置点较少。
之后,第一路侧设备11可以执行步骤S2203,得到第一通信能力信息。
图25示出本申请实施例提供的通信系统的结构示意图。如图25所示,该通信系统包括:第一路侧设备11、第二终端设备13和服务器14。其中,第一路侧设备11可以为任意一个路侧设备。第二终端设备13可以表示与服务器14建立通信连接的终端设备。第一路侧设备11和第二终端设备13均可以通过蜂窝网与服务器14建立通信连接。第二终端设备13获取自身的交通参与者数据之后,可以将获取的交通参与者数据上报至服务器14。考虑到第二终端设备13中可能包括与第一路侧设备11建立通信连接的第一终端设备12,即部分第二终端设备13可能既与服务器14建立了通信连接,又与第一路侧设备11建立了通信连接。因此,服务器14接收到各第二终端设备13上报的交通参与者数据之后,可以基于各交通参与者数据中的工作状态信息和路侧设备的标识信息,筛选出与第一路侧设备建立了通信连接的交通参与者数据。具体的,服务器14可以从接收到的交通参与者数据中筛选出通信模块的工作状态信息为“正常工作状态”,路侧设备的标识信息中包括“第一路侧设备11的标识信息”的交通参与者数据,并基于筛选出的交通参与者数据,生成第一通信状态指示信息。
在一种可能的实现方式中,服务器14生成第一通信状态指示信息之后,可以执行步骤S2202得到第一分布情况,或者将第一通信状态指示信息发送至第一路侧设备11,由第一路侧设备11执行步骤S2202得到第一分布情况。
在一种可能的实现方式中,在服务器14生成第一通信状态指示信息的过程中,服务器14在筛选交通参与者数据的过程中,可以先在收集到的交通参与者数据中找到在第一路侧设备的预选范围内的交通参与者数据,然后从预选范围内的交通参与者数据中筛选出工作状态信息为“正常工作状态”的交通参与者数据,为了便于描述,在本申请实施例中,将此时筛选出的交通参与者数据组成的数据集称为数据集A。之后,服务器14可以从数据集A中筛选出路侧设备的标识信息中包括“第一路侧设备11的标识信息”的交通参与者数据,在本申请实施例中,此时筛选出的交通参与者数据组成的数据集称为数据集B。数据集A中除数据集B中的交通参与者数据以外的交通参与者数据组成的数据集称为数据集C。其中,预选范围为第一路侧设备11周围的区域,预选范围可以根据第一路侧设备11的通信范围出厂指标以及第一路侧设备的安装方向确定,例如可以根据第一路侧设备的通信范围出厂指标的基础上,在安装方向上预留一定裕度(例如,扩大3米、5米等),得到预选范围。
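下面用一个示意性的筛选草图(记录字段沿用前文 Position、Connection、RSU_ID、Time 的含义,标识与数值均为虚构)说明数据集A、数据集B、数据集C 的划分方式:

```python
# 示意性草图:从收集到的交通参与者数据中筛选出数据集A、B、C
# 每条记录包含位置、通信模块工作状态、所连路侧设备标识与时间,数值均为虚构
records = [
    {"pos": (12.0, 3.0),  "state": "正常工作状态", "rsu_id": "RSU-001", "time": "t1"},
    {"pos": (35.0, -2.0), "state": "正常工作状态", "rsu_id": "RSU-002", "time": "t1"},
    {"pos": (300.0, 8.0), "state": "正常工作状态", "rsu_id": "RSU-001", "time": "t2"},
    {"pos": (20.0, 1.0),  "state": "故障",         "rsu_id": "",        "time": "t2"},
]

FIRST_RSU_ID = "RSU-001"
PRESELECT_RADIUS = 120.0     # 预选范围半径(假设值,单位:米),以第一路侧设备为原点

def in_preselect(pos):
    return (pos[0] ** 2 + pos[1] ** 2) ** 0.5 <= PRESELECT_RADIUS

# 数据集A:预选范围内且通信模块处于正常工作状态的数据
dataset_a = [r for r in records if in_preselect(r["pos"]) and r["state"] == "正常工作状态"]
# 数据集B:数据集A中与第一路侧设备建立了通信连接的数据
dataset_b = [r for r in dataset_a if r["rsu_id"] == FIRST_RSU_ID]
# 数据集C:数据集A中未与第一路侧设备建立通信连接的数据
dataset_c = [r for r in dataset_a if r["rsu_id"] != FIRST_RSU_ID]

print(len(dataset_a), len(dataset_b), len(dataset_c))   # 2 1 1
```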
图26示出终端设备的分布情况示意图。如图26所示,在预选范围内,示出了多个终端设备的位置点,在某些位置点上,终端设备能够与第一路侧设备建立通信连接,在某些位置点上,终端设备无法与第一路侧设备建立通信连接。这些能够与第一路侧设备建立通信连接的位置点对应的交通参与者数据在数据集B中,这些未与第一路侧设备建立通信连接的位置点对应的交通参与者数据在数据集C中。图26所示的数据集B中的位置点的位置信息即为第一分布情况。参照图26可知,在靠近第一路侧设备的区域,能够与第一路侧设备建立通信连接的位置点较多,在远离第一路侧设备的区域,能够与第一路侧设备建立通信连接的位置 点较少。
之后,服务器14或者第一路侧设备11可以执行步骤S2203,得到第一通信能力信息。
下面对步骤S2203中根据第一分布情况,生成第一通信能力信息的过程进行说明。参照图23和图25可知,步骤S2203可以由第一路侧设备执行也可以由服务器执行,下面以第一路侧设备执行步骤S2203为例进行说明,由服务器执行步骤S2203的过程可以参照由第一路侧设备执行步骤S2203的过程,本申请实施例中不再赘述。
在一种可能的实现方式中,步骤S2203可以包括:第一路侧设备根据第一分布情况,直接生成第一通信能力信息。其中,第一分布情况可以为第一位置点的密度,第一位置点表示第一终端设备的位置点。在第一位置点的密度较大的区域,第一路侧设备的通信能力较强,第一位置点的密度较小的区域,第一路侧设备的通信能力较弱,因此,第一路侧设备可以根据第一位置点的密度生成第一通信能力信息。
在一种可能的实现方式中,第一路侧设备可以获取第二通信状态指示信息,根据第二状态指示信息,确定第二分布情况,然后在步骤S2203中根据第一分布情况和第二分布情况,生成第一通信能力信息。
其中,第二通信状态指示信息用于指示至少一个终端设备(为了便于描述,在本申请实施例中,将至少一个终端设备称为至少一个第三终端设备)在至少一个位置点(为了便于描述,在本申请实施例中,将至少一个位置点称为至少一个第三位置点)与第二路侧设备建立通信连接,且至少一个第三位置点距离第一路侧设备的距离小于预设阈值。第二通信状态指示信息的获取过程可以参照第一通信状态指示信息的获取过程,将第一通信状态指示信息获取过程中的第一路侧设备替换为第二路侧设备,并将交通参与者信息中的位置信息限制在距离第一路侧设备的距离小于预设阈值的范围内即可。其中,预设阈值可以根据需要进行设置,距离来说,预设阈值可以为100米、200米、500米或者1000米等。在一个示例中,第一路侧设备可以根据第三位置点(即第二通信状态指示信息指示的位置点)的位置信息,确定第二分布情况。第二分布情况可以参照图26中数据集B的位置点的位置信息加上数据集C中的位置点的位置信息。
一个终端设备在距离第一路侧设备的距离小于预设阈值的一个位置点上,与第二路侧设备建立了通信连接,表明该终端设备的通信模块的工作状态信息为“正常工作状态”,且该终端设备处于第一路侧设备周围。此时,该终端设备即为上述第三终端设备,该位置点即为上述第三位置点。一个第三终端设备在一个第三位置点上可能与第一路侧设备建立了通信连接(例如,图26所示的数据集B中的位置点),也可能未与第一路侧设备建立了通信连接(例如,图26所示的数据集C中的位置点)。在本申请实施例中,可以将第二分布情况作为第一分布情况的比较对象,以第二分布情况反应第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点,以第一分布情况反应第一路侧设备实际建立了通信连接的位置点。在本申请实施例中,可以基于第一分布情况和第二分布情况可以确定稳定连接率。其中,稳定连接率可以为第一位置点的数量与第三位置点的数量的比值。可以理解的是,在稳定连接率较大时,表明第一路侧设备实际建立了通信连接的位置点的数量与第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点的数量较为接近,第一路侧设备的通信能力较好。在稳定连接率较小时,表明第一路侧设备实际建立了通信连接的位置点的数量与第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点的数量差距较大,第一路侧设备的通信能力较差。因此,第一路侧设备可以根据稳定连接率生成第一通信能力信息。
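作为一个最小示例(数量为虚构),稳定连接率可以按"第一位置点数量与第三位置点数量的比值"计算,并据此给出一种假设的通信能力判断:

```python
# 最小示例:由第一位置点与第三位置点的数量计算稳定连接率(数值为虚构)
first_points_count = 180    # 第一位置点:实际与第一路侧设备建立了通信连接的位置点数量
third_points_count = 200    # 第三位置点:第一路侧设备周围实际存在的、能够建立连接的位置点数量

stable_connection_rate = first_points_count / third_points_count
print(f"稳定连接率 = {stable_connection_rate:.2f}")    # 0.90

# 一种假设的判断方式:稳定连接率越高,通信能力越好
if stable_connection_rate >= 0.9:
    print("通信能力较好")
else:
    print("通信能力较差")
```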
在一种可能的实现方式中,第一路侧设备可以获取第三通信状态指示信息,然后根据第三通信状态指示信息,确定第三分布情况,然后在步骤S2203中根据第一分布情况和第三分布情况,生成第一通信能力信息。
其中,第三通信状态指示信息用于指示至少一个终端设备(为了便于描述,在本申请实施例中,将至少一个终端设备称为至少一个第二终端设备)在至少一个位置点(为了便于描述,在本申请实施例中,将至少一个位置点称为至少一个第二位置点)与服务器建立通信连接,且至少一个第二终端设备具备连接第一路侧设备的能力,至少一个第二位置点距离第一路侧设备的距离小于预设阈值。第三通信状态指示信息的获取过程可以如图25,由服务器从接收到的交通参与者信息中进行筛选获得。具体的,服务器可从接收到的交通参与者信息中筛选出位置信息在距离第一路侧设备的距离小于预设阈值且通信模块的工作状态信息为“正常工作状态”的交通参与者数据,然后基于筛选出的交通参与者数据获得第三通信状态指示信息。在一个示例中,第一路侧设备可以根据第二位置点(即第三通信状态指示信息指示的位置点)的位置信息,确定第三分布情况。第三分布情况可以参照图26中数据集B的位置点的位置信息加上数据集C中的位置点的位置信息。
一个终端设备在距离第一路侧设备的距离小于预设阈值的一个位置点上与服务器建立了通信连接,且该终端设备的通信模块的工作状态信息为"正常工作状态",表明该终端设备在第一路侧设备附近且具有连接第一路侧设备的能力。此时,该终端设备即为上述第二终端设备,该位置点即为上述第二位置点。若第一路侧设备未与第二终端设备建立通信连接,则表明第一路侧设备在对应第二位置点的通信能力较差;若第一路侧设备与第二终端设备建立了通信连接,则表明第一路侧设备在对应第二位置点的通信能力较强。因此,在本申请实施例中,可以将第三分布情况作为第一分布情况的比较对象,以第三分布情况反映第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点,以第一分布情况反映第一路侧设备实际建立了通信连接的位置点。在本申请实施例中,可以基于第一分布情况和第三分布情况确定稳定连接率。其中,稳定连接率可以为第一位置点的数量与第二位置点的数量的比值。可以理解的是,在稳定连接率较大时,表明第一路侧设备实际建立了通信连接的位置点的数量与第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点的数量较为接近,第一路侧设备的通信能力较好。在稳定连接率较小时,表明第一路侧设备实际建立了通信连接的位置点的数量与第一路侧设备周围实际存在的能够与第一路侧设备建立通信连接的位置点的数量差距较大,第一路侧设备的通信能力较差。因此,第一路侧设备可以根据稳定连接率生成第一通信能力信息。
在一种可能的实现方式中,步骤S2203可以包括:基于第一路侧设备的预选范围,确定出多个网格;合并所述多个网格中网格指标满足第一条件的网格,得到合并后的网格,并继续合并存在的网格中网格指标满足第一条件的网格,直至不存在满足第一条件的网格;针对任意一个网格,将所述网格确定为一个通信区域,并基于所述网格的网格指标所属的指标范围,确定所述网格的通信能力级别;根据各网格的位置信息和通信能力级别,确定第一通信能力信息。
在一个示例中,基于第一路侧设备的预选范围,确定出多个网格可以包括:对第一路侧设备的预选范围进行网格化处理得到多个网格。在又一示例中,基于第一路侧设备的预选范围,确定出多个网格可以包括:取第一路侧设备的预选范围与第一道路的交集,得到待划分区域;对待划分区域进行网格化处理,得到多个网格。其中,第一道路可以表示第一路侧设备所在道路或者第一路侧设备周围的道路,第一道路与第一路侧设备的关联关系可以在部署 第一路侧设备时预先设置。
其中,网格指标为:网格中第一位置点的密度或者稳定连接率,相应的第一条件为:密度差值小于第一阈值或者稳定连接率差值小于第二阈值。其中,第一阈值和第二阈值可以根据需要进行设置,例如,第一阈值可以为0.2个/㎡等,第二阈值可以为0.1等。本申请实施例对第一阈值和第二阈值不做限制。
图27示出本申请实施例中网格的示例性示意图。如图27所示,基于图24所示的第一分布情况,第一路侧设备的预选范围划分为多个网格。在一个示例中,待划分区域均匀的划分成多个网格(如图27所示),这样,便于统计管理。当然,还可以采用其他方式将待划分区域划分成多个网格,例如,在距离第一路侧设备较近的区域划分出的网格面积小于距离第一路侧设备较远的区域划分出的网格的面积(未示出)。这样,可以减少计算次数和合并次数。
在划分完网格之后,可以将每个网格的第一位置点的密度确定为各个网格的网格指标。在确定各个网格的网格指标之后,可以将多个网格中网格指标满足第一条件的网格合并,得到合并后的网格。
之后,确定经过上一轮合并之后得到的各个网格的网格指标,并继续合并存在的网格中网格指标满足所述第一条件的网格,直至不存在满足第一条件的网格。图28示出本申请实施例中网格的合并结果的示例性示意图。如图28所示,图27所示的网格最终合并得到了区域1和区域2。其中,区域1中第一位置点的密度较小,区域2中第一位置点的密度较大,可见,第一路侧设备在区域1中具有通信能力,但通信能力较差,在区域2中具有通信能力,且通信能力较强。
图29示出本申请实施例中网格的示例性示意图。如图29所示,基于图26所示的终端设备的分布情况,在第一路侧设备的预选范围划分为多个网格。在划分完网格之后,可以将每个网格的稳定连接率确定为各个网格的网格指标。在确定各个网格的网格指标之后,可以将多个网格中网格指标满足第一条件的网格合并,得到合并后的网格。之后,确定经过上一轮合并之后得到的各个网格的网格指标,并继续合并存在的网格中网格指标满足所述第一条件的网格,直至不存在满足第一条件的网格。图30示出本申请实施例中网格的合并结果的示例性示意图。如图30所示,图29所示的网格最终合并得到了区域1和区域2。其中,区域1的稳定连接率较小,区域2的稳定连接率较大,可见第一路侧设备在区域1中具有通信能力,但通信能力较差,在区域2中具有通信能力,且通信能力较强。
在不存在满足第一条件的网格的情况下,即网格无法继续合并的情况下,针对任意一个网格,将所述网格确定为一个通信区域,并基于所述通信区域的网格指标所属的指标范围,确定所述通信区域的通信能力级别;根据各通信区域的位置信息和通信能力级别,可以确定第一路侧设备的通信能力信息。
在本申请实施例中,每个指标范围对应一个通信能力级别,基于通信区域的网格指标所属的指标范围,确定通信区域的通信能力级别包括:在所述通信区域的网格指标属于第一指标范围的情况下,确定所述通信区域的通信能力级别为第一通信能力级别。其中,第一指标范围为各指标范围中的任意一个,第一通信能力级别为第一指标范围对应的通信能力级别。以图28和图30为例,有两个通信区域:区域1和区域2,其中,区域1的网格指标属于指标范围1,区域2的网格指标属于指标范围2,则可以确定第一路侧设备在区域1的通信能力级别为级别1,在区域2的通信能力级别为级别2。
在一个示例中,通信区域的网格指标属于第一指标范围可以包括:密度在第一范围内,和/或,稳定连接率在第二范围内。其中,第一范围、第二范围可以根据需要进行设置,本申 请实施例不做限制。
在一个示例中,通信能力级别可以包括:通信盲区、通信能力较弱、通信能力一般和通信能力较强。在又一示例中,通信能力级别可以包括:低级、中级和高级。在另一示例中,通信能力级别可以包括:第一级、第二级、第三级和第四级等。可以理解的是,以上仅为通信能力级别的示例性说明,本申请实施例对通信能力级别的划分方式和划分数量不做限制。
在一种可能的实现方式中,第一通信能力信息可以用于指示第一路侧设备的通信能力。举例来说,第一通信能力信息可以指示第一路侧设备能够通信的区域和不能通信的区域。例如,第一路侧设备可以与处于200米以内的区域的终端设备通信,无法与处于200米以外的区域的终端设备通信。
在一种可能的实现方式中,第一通信能力信息可以用于指示第一区域和第一路侧设备在第一区域内的通信能力。
其中,第一区域可以表示任意一个区域。在一个示例中,第一区域可以为第一道路上的第一区域。第一区域可以为矩形、扇形、椭圆形或者其他形状。本申请实施例对第一区域的形状和面积不做限制。举例来说,第一路侧设备在100米以内的区域通信效果较好,即通信能力为强通信能力;在100米到150米的通信效果一般,即通信能力为中等通信能力;在150米到200米的区域的通信效果较差,即通信能力为弱通信能力;与200米以外的区域无法通信,即通信能力为无法通信。
在一种可能的实现方式中,第一通信能力信息可以用于指示第一场景、第一区域和第一路侧设备在第一场景下第一区域内的通信能力。
本申请实施例中的“场景”用于标识具有通信功能的设备所处的环境(例如,第一路侧设备所处的环境),或者标识具有通信功能的设备的通信对象所处的环境(例如,车辆或者行人所处的环境)。其中,第一场景可以表示任意一种场景。举例来说,第一场景包括但不限于白天、夜间、晴天、阴天、风沙、雨雪、雾天等影响感知能力的场景。可以理解的是,第一路侧设备在晴天的通信范围要大于阴天、风沙、雨雪和雾天的通信范围。风沙大小不同、雨雪强度不同或者雾的级别不同,第一路侧设备的通信范围也不同。白天车流量大通信范围可能较小,晚上车流量小通信范围可能较大。因此,在本申请实施例中,可以分场景描述第一路侧设备的通信能力,从而使得第一路侧设备的通信能力的准确性更高。举例来说,在晴天的场景下,第一路侧设备在图30所示的区域1的通信能力为中等通信,在图30所示的区域2的通信能力为强通信;在雾天场景下,第一路侧设备在图30所示的区域1的通信能力为弱通信,在图30所示的区域2的通信能力为中等通信。
需要说明的是,在第一通信能力信息指示第一场景、第一区域和第一路侧设备在第一场景下第一区域内的通信能力时,可以在前述交通参与者数据中添加场景标签,这样可以获取到第一场景下的第一通信状态指示信息、第二通信状态指示信息和第三通信状态指示信息。若上述交通参与者数据中未添加场景标签,则在获取第一场景下的交通参与者数据之前,可以结合第三方信息(例如结合时间信息和历史天气信息)得到第一场景下的交通参与者数据。
至此,获得了第一路侧设备的第一通信能力信息。在本申请实施例中,针对其他路侧设备获取通信能力信息的方式可以参照获取第一路侧设备的第一通信能力信息的方式,这里不再赘述。例如,获取第二路侧设备的第二通信能力信息的方式可以参照获取第一路侧设备的第一通信能力信息的方式。
在一种可能的实现方式中,第一路侧设备的第一通信能力信息可以与道路的标识进行关联。这样,在规划路线或者交通参与者计划进入一条路或者一段路之前,可以调出一条路或者一段路上各个路侧设备的通信能力信息,从而确定一条路或者一段路上各个区域的路侧通信能力,有利于提高安全性。
在一种可能的实现方式中,第一通信能力信息可以存储为地图数据。这样,车辆在进行智能驾驶时,可以从地图中获取到第一通信能力信息,从而确定车辆在行驶到某个区域时是否需要驾驶员接管车辆,在某个区域是否需要降低来自第一路侧设备的信息的置信度,或者在规划路径时是否需要避开某个区域,从而提高安全性。可以理解的是,第一通信能力信息可以与第一路侧设备相关联后存储为地图数据。其他路侧设备的通信能力信息(例如第二路侧设备的第二通信能力信息)也可以存储为地图数据,以提高安全性。
下面对通信能力信息的应用进行说明。
考虑到道路上遮挡等原因可能导致多个路侧设备下仍存在通信盲区,本申请实施例中可以综合多个路侧设备的通信能力信息形成整体的通信覆盖能力。在一种可能的实现方式中,所述方法还包括:生成多个路侧设备的多个通信能力信息;根据所述多个通信能力信息,生成通信盲区信息。
其中,所述多个通信能力信息用于指示多个路侧设备的通信能力。具体的,多个路侧设备包括所述第一路侧设备,所述多个通信能力信息包括所述第一通信能力信息。另外,多个路侧设备还可以包括一个或多个第二路侧设备,则多个通信能力信息可以包括一个或多个第二通信能力信息。
通信盲区信息用于指示所述多个路侧设备中的一个或多个路侧设备未覆盖的区域。在一个示例中,所述多个路侧设备中的一个或多个路侧设备未覆盖的区域包括:绝对盲区和/或相对盲区,其中,所述多个路侧设备中任一路侧设备在所述绝对盲区内均不能达到阈值T1,所述多个路侧设备中的部分路侧设备在所述相对盲区中不能达到阈值T2。
其中,阈值T1和阈值T2可以根据需要进行设置,本申请实施例对阈值T1和阈值T2不做限制。阈值T1和阈值T2可以用于指示期望或者可接受的通信效果。在一个路侧设备不能达到阈值T1或者阈值T2时,表明路侧设备的通信效果未达到预期或者是不能被接受。在一个路侧设备能够达到阈值T1或者阈值T2时,表明路侧设备的通信效果可以达到预期或者可以接受。在一个示例中,阈值T1和阈值T2包括但不限于:符合预置通信能力级别(例如,对应通信能力级别为一级或者二级),或者处于预置指标范围(例如,密度落在预置指标范围内,稳定连接率落在预置指标范围内)等。在一个路侧设备在一个区域内未达到阈值T1的情况下,表明该路侧设备在该区域的通信效果较差,在该区域该路侧设备通信获得的信息可靠性、准确性较低(置信度较低、不够完整),因此,该区域为该路侧设备的盲区。在本申请实施例中,阈值T1和阈值T2可以相同也可以不同,对此不做限制。
图31示出本申请实施例的通信盲区的示例性示意图。图31中示出了路侧设备1的通信盲区和非通信盲区的分界线,以及路侧设备2的通信盲区和非通信盲区的分界线。在分界线以内的区域为非通信盲区,分界线以外的区域为通信盲区。路侧设备1的通信盲区与路侧设备2的非通信盲区的交集,以及路侧设备1的非通信盲区和路侧设备2的通信盲区的交集,为相对通信盲区。路侧设备1的通信盲区和路侧设备2的通信盲区的交集为绝对通信盲区。
以图31所示的路侧设备1和路侧设备2为例,假设阈值T1和阈值T2相同,对确定相对通信盲区和绝对通信盲区的过程进行说明。
在一种可能的实现方式中,在路侧设备1和路侧设备2之间建立了通信连接的情况下,对一个区域的通信能力,以路侧设备1和路侧设备2中最好的通信能力为准。对于一个区域,若路侧设备1的通信能力与路侧设备2的通信能力均未达到阈值T1,则可以确定该区域为绝 对通信盲区。这种情况下,可以不标记相对通信盲区。
在一种可能的实现方式中,在路侧设备1和路侧设备2之间未建立通信连接的情况下,将路侧设备1的通信能力未达到阈值T1但路侧设备2的通信能力达到了阈值T1的区域,以及路侧设备2的通信能力未达到阈值T1但路侧设备1的通信能力达到了阈值T1,确定为相对通信盲区;将两者的通信能力均未达到阈值T1的区域确定为绝对通信盲区。
在一个示例中,可以为绝对通信盲区和相对通信盲区添加不同的标识。例如,为绝对通信盲区添加第一标识,为相对通信盲区添加第二标识。这样,根据标识即可确定一个通信盲区是绝对通信盲区还是相对通信盲区。可选的,在标识相对通信盲区时,还可以将相对通信盲区与路侧设备的标识关联,以明确一个相对通信盲区是哪个路侧设备的通信盲区。
在又一示例中,可以将路侧设备的通信能力信息与该路侧设备建立了通信连接的路侧设备进行关联。这样,用户可以自行判断出路侧设备与哪些路侧设备建立了通信连接,从而确定出哪里是绝对通信盲区哪里是相对通信盲区。
在一种可能的实现方式中,所述方法还包括:根据所述第一通信能力信息生成预警提示信息。其中,预警提示信息可以用于提示在第二区域内由驾驶员接管车辆、对所述第一路侧设备进行故障检测、更新所述第一路侧设备的软件,或者调整所述第一路侧设备的部署、在所述第二区域降低来自所述第一路侧设备的信息的置信度或者在规划路径时避开所述第二区域,其中所述第一通信能力信息指示所述第一路侧设备在所述第二区域内的通信能力低于第一阈值。
其中,第一通信能力信息指示第一路侧设备在第二区域内的通信能力低于第一阈值。第一阈值可以根据需要进行设置。在一个示例中,低于第一阈值可以包括但不限于:未达到预置通信能力级别(例如,未达到一级通信能力级别或者未达到二级通信能力级别等)、第一位置点的密度未达到预置的密度阈值、稳定连接率未达到预置的稳定性阈值中的一者或多者。这里的密度阈值和稳定性阈值可以根据需要进行设置,本申请实施例不做限制。考虑到阈值T1和阈值T2用于判定通信盲区,第一阈值用于进行预警,在非通信盲区但是通信效果较差的区域是需要进行预警的,因此,在一个示例中,第一阈值可以大于(高于)或者等于阈值T1以及阈值T2。
由于在第二区域内第一路侧设备的通信能力低于第一阈值,代表第一路侧设备在第二区域内的通信效果较差,第一路侧设备无法较准确且全面的与第二区域内的终端设备进行通信,因此,也就无法保证第一路侧设备可以将其获得的信息(包括自身感知到的信息以及从其他设备收集到的信息)传递给第二区域内的每个终端设备。因此,在第二区域内车辆进行自动驾驶时数据来源可能不够多,风险较高,驾驶员可以在第二区域内接管车辆。同时,可以对第一路侧设备进行故障检查,查看是否因为第一路侧设备发生了故障而造成了第一路侧设备在第二区内通信效果较差,特别是在第二区域距离第一路侧设备较近的情况下。还可以更新第一路侧设备的软件或者调整第一路侧设备的部署,使得第一路侧设备的通信能力范围更加合理。另外,由于第一路侧设备在第二区域内的通信效果较差,第一路侧设备收集的第二区域内的终端设备的信息并不能较好的代表第二区域内的实际情况,因此,在第二区域内需要降低第一路侧设备获得的信息的置信度。由于第一路侧设备在第二区域内的通信效果较差,因此在路径规划时可以避开第二区域,这样可以降低车辆进入第二区域后发生事故的可能性,特别是对于自动驾驶的车辆,避开第二区域行驶,就不需要驾驶员接管车辆,可以有效提升用户体验。
示例性地,请参见图8A和图8B,图8A是本申请实施例适用的一种可能的场景示意图,图8B是本申请实施例提供的一种可能的覆盖区域的示意图。感知设备801和感知设备802属于同一个感知设备组,可以对道路情况进行感知。感知设备801的感知覆盖区域与感知设备802的感知覆盖区域如图8B所示。
请参见表4,表4是本申请实施例提供的一种可能的覆盖信息,表4所示的覆盖信息用于示例性地描述图8A以及图8B所示的覆盖区域。示例性地,感知设备组1对应的覆盖能力,由感知设备1和感知设备2的覆盖能力融合得到的;感知设备组1的覆盖区域,是根据融合后的覆盖能力得到的。可选的,在感知设备组1包含多个覆盖区域时,多个覆盖区域按照融合后的覆盖能力按等级划分。
表4覆盖信息
(表4的具体内容在原文中以图像形式给出)
可选的,覆盖信息中的覆盖区域可以是融合多个设备的覆盖区域得到的。进一步可选的,覆盖信息中覆盖能力信息也可以是融合多个设备的覆盖能力得到的。例如,以感知设备组1的覆盖区域6为例,覆盖区域6可以是融合感知设备1的覆盖区域7和感知设备2的覆盖区域8得到的。其中,融合可以理解为:覆盖区域6是通过覆盖区域7和覆盖区域8的重叠部分得到。当然,在一些具体实施过程中,还可以通过拟合、强化学习模型、深度学习模型、或者预置的计算方式进行融合,本申请对于上述融合感知区域的方法同样适用。而感知设备组1的覆盖区域6的覆盖能力信息可以是根据感知设备1的覆盖能力信息和感知设备2的覆盖能力信息确定的。示例性地,感知设备组1在覆盖区域6内的覆盖能力,通过融合感知设备1和感知设备2的覆盖能力得到。其中,覆盖能力信息的融合也可以拟合、强化学习模型、深度学习模型、或者预置的计算方式进行融合,本申请对于上述多种融合方式的同样适用。
在又一种可能的设计中,在覆盖信息包含多个覆盖区域时,多个覆盖区域之间可以有重叠区域。可选的,在覆盖信息中还包含多个覆盖区域所对应的多个覆盖能力的信息。例如,参见图8B,感知设备801的覆盖区域7和感知设备802的覆盖区域8之间可以存在重叠区域。为了便于描述,请参见表5,表5是本申请实施例提供的又一种可能的覆盖信息,表5所示的覆盖信息用于示例性地描述图8A以及图8B所示的覆盖区域。如表5所示的覆盖信息可以包含感知设备801的覆盖区域7以及对应覆盖区域7的覆盖能力的信息(示例性地,对应的覆盖能力信息可以为感知结果正确率>98%且召回率>94%),还可以包含感知设备802的覆盖区域8以及对应覆盖区域8的覆盖能力的信息(示例性地,对应的覆盖能力信息可以为感知结果正确率>95%且召回率>90%)。
表5覆盖信息
(表5的具体内容在原文中以图像形式给出)
应理解,多个覆盖区域之间也可以没有重叠区域,本申请对于多个覆盖区域之间没有重叠区域的情况仍然适用。
可选的,覆盖信息中还可以包含盲区的信息,其中盲区可以包含通信盲区、感知盲区等等中的至少一个。应理解,覆盖信息中的覆盖区域可以是根据不同等级的覆盖能力划分得到的,因此,盲区也可以对应不同的盲区等级。例如,将感知结果正确率低于40%的区域作为一级感知盲区,将感知结果正确率低于10%的区域作为二级感知盲区。
可选的,通信盲区和感知盲区可以分别独立,也可以进行如取交集等处理。例如,参见图9,图9是本申请实施例提供的一种可能的盲区示意图,其中范围1(Scope1)为感知设备901的一个覆盖区域,范围2(Scope2)为通信设备902的通信能力对应的一个覆盖区域。其中,A路段在两个覆盖区域内,而B路段、C路段属于感知盲区,但是并不完全属于通信盲区,因此位于B、C路段的车辆或者其他装置仍能够接收来自感知设备901的感知结果。
可选的,所述路侧设备在所述至少一个覆盖区域内的覆盖能力信息指示多种环境下的多种能力。例如,晴天、雨天、雾霾天气等不同的天气环境下,覆盖能力信息可以是不同的。再如,白天、夜晚等不同的时刻,或者不同的温度、湿度、亮度条件下,路侧设备的覆盖能力信息可以是不同的。
在一种可能的设计中,覆盖信息中可以包含用于指示适用场景的信息。例如,请参见表6,覆盖信息中包含适用场景字段,该字段用于指示感知设备3在不同环境下的覆盖能力,在后续使用覆盖信息时,可以合理地考虑场景因素,提高覆盖能力的覆盖范围的准确性,提升可靠性。
表6覆盖信息
(表6的具体内容在原文中以图像形式给出)
应理解,表6以适用场景为例进行说明,具体实施过程中,覆盖信息也可以包含季节、时段、天气、温度、湿度、亮度等等中的一个或者多个字段。
可选的,覆盖信息中还可以包含路侧设备的标识、瓦片标识(ID)等等中的一项或者多项。其中,瓦片是瓦片地图中的一个组成部分。在一种可能的设计中,当覆盖信息为地图数据时,覆盖信息可以包括瓦片ID。
通过瓦片ID可以将覆盖信息关联到瓦片,可以便于利用覆盖信息更新地图,便于存储和管理覆盖信息。
步骤S402:所述数据处理装置将所述覆盖信息存储为地图数据。
数据处理装置可以将获取的覆盖信息直接进行存储,也可以对该覆盖信息进行处理后再 进行存储,处理后的覆盖信息更符合地图数据的存储要求,形式可能与获取的覆盖信息不同,但是指示的内容是一致的。
将所述覆盖信息存储为地图数据,是指将该覆盖信息作为一种在地图中承载的信息,采用地图中其他信息的编译形式或存储格式,存储于云端、路侧或终端的存储介质中。示例性地,请参见图11,图11是本申请实施例提供的一种可能的覆盖信息作为地图数据的数据结构示意图。其中,瓦片ID用于标识地图瓦片,路侧ID用于标识路侧设备。以通信覆盖范围为例,每一个路侧ID的下层包含了该路侧ID对应的通信覆盖范围的信息,具体可以包括默认使用范围级别、以及至少一个等级的覆盖区域。其中,默认使用范围级别用于指示默认显示哪一个级别的覆盖能力对应的覆盖区域。等级(例如一级范围、二级范围、三级范围等)用于指示不同的覆盖能力。范围等级的下层可以包含覆盖区域,可选还包含该等级指示的内容(即图示的指标)。
可选的,图11所示的数据结构中,范围等级的下层可以包含该等级指示的指标(或者说内容、指标项)以及指标对应的数值(或者取值范围),例如,“指标:正确率,取值范围:≥90”。
图11所示的数据结构仅为示例,在路侧设备包含多个感知设备或者通信设备的情况下,路侧设备的ID的下层可以包含多个感知设备(或者感知设备组的)的ID、或者通信设备(或者通信设备组)的ID。示例性地,请参见图12,图12是本申请实施例提供的又一种可能的覆盖信息的结构示意图,每一个路侧ID的下层包含了传感器(或者称为感知设备)ID、或者传感器组ID等。示例性地,图12所示的该数据结构中,路侧设备ID的下层可以包含传感器组的标识、传感器的标识等。传感器组的标识下层包含了该传感器组的传感器列表、传感器列表中每一个传感器的工作状态(例如可以为正常工作、故障等等工作状态)、默认使用范围级别、工作模式(包含融合模式、单传感器模式等模式)等中的一项或者多项。其中,默认使用范围级别用于指示默认显示哪一个级别的覆盖能力对应的覆盖区域。等级(例如一级范围、二级范围、三级范围等)用于指示不同的覆盖能力。范围等级的下层可以包含覆盖区域,可选还包含该等级指示的内容(即图示的指标)。传感器的工作情况、故障状态、是否使用默认的范围级别、是否处于融合(FUSION)模式等等。以传感器组内包含激光雷达(lidar)和视觉传感器(camera,或者称为相机器、图像传感器、或摄像头等)的融合传感器为例,该覆盖信息的数据结构中可以包含融合得到的感知区域和覆盖能力,也包含单激光雷达的感知区域和覆盖能力,还可以包含单视觉传感器的感知区域和覆盖能力。
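与图11、图12描述的层级相对应,下面给出一个示意性的嵌套数据结构草图(其中的标识、等级与取值均为假设,并非本申请规定的存储格式),用于说明覆盖信息作为地图数据存储时的一种可能组织方式:

```python
# 示意性草图:瓦片 -> 路侧设备 -> 传感器组 -> 范围等级 的嵌套结构(字段与取值均为假设)
map_coverage_data = {
    "tile_id": "TILE_000123",
    "roadside_devices": [
        {
            "roadside_id": "RSU-001",
            "default_range_level": 1,            # 默认使用范围级别
            "sensor_groups": [
                {
                    "group_id": "GROUP-01",
                    "sensors": ["lidar-01", "camera-01"],
                    "work_mode": "融合模式",
                    "range_levels": [
                        {"level": 1, "metric": "感知结果正确率", "value": "≥90%", "area_id": "覆盖区域1"},
                        {"level": 2, "metric": "感知结果正确率", "value": "≥75%", "area_id": "覆盖区域2"},
                    ],
                },
            ],
        },
    ],
}

def default_area(device: dict) -> list:
    """返回该路侧设备默认使用范围级别对应的覆盖区域标识列表。"""
    level = device["default_range_level"]
    return [r["area_id"]
            for g in device["sensor_groups"]
            for r in g["range_levels"] if r["level"] == level]

print(default_area(map_coverage_data["roadside_devices"][0]))   # ['覆盖区域1']
```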
一种可能的设计中,数据处理装置可以根据所述覆盖能力生成第一图层,该第一图层属于前述的地图。示例性地,请参见图10A,图10A是本申请实施例提供的一种可能的地图图层的示意图,图10A所示的地图可以包含覆盖范围图层、道路图层、建筑图层、拥堵状态图层等等图层(仅为示例)。
可以将该覆盖信息显示于显示界面上。具体地,关于覆盖信息的地图图层可以单独显示,也可以以叠加的方式与其他地图图层一起显示于地图显示界面上。示例性地,请参见图10B,图10B是本申请实施例提供的一种可能的地图的示意图,将覆盖范围图层、拥堵状态图层、道路图层、建筑图层叠加显示可以得到如图10B所示的地图。
应理解,数据处理装置还可以根据覆盖信息更新该地图中的覆盖范围图层,其中,更新地图包含增加覆盖区域、减少覆盖区域、修改覆盖区域、修改能力信息等中的一项或者多项。例如,根据环境变化选择不同的环境下的覆盖区域进行显示、或者在路侧设备故障时停止显示故障的路侧设备的覆盖区域。在一种设计中,数据处理装置可以接收更新指示信息,从而 对地图进行更新。
又一种可能的设计中,数据处理装置可以根据覆盖信息,确定地图对应的数据结构。后续数据处理装置可以通过地图对应的数据结构,更新地图。
可选的,上述图11或图12的数据结构中,可以包含盲区。盲区也可以包含不同的等级,具体可以参考前述,此处不再赘述。
在一种可能的设计中,数据处理装置作为地图生成装置,可以在生成或更新包括所述覆盖信息的地图后,将地图发送给其他装置(车辆、路侧设备、辅助驾驶服务器等)。
数据处理装置还可以利用覆盖信息,进行信息处理。例如,数据处理装置根据覆盖能力确定车辆的安全等级、车辆的驾驶策略等等中的一项或者多项。
示例性地,本申请实施例例举如下两种可能的设计:
设计1,数据处理装置根据覆盖能力确定车辆的安全等级。其中,车辆的安全等级可以用于确定车辆的自动驾驶装置参与车辆操作的权重。请参见表7,表7是本申请实施例例举的一种可能的车辆的安全等级表。可以看出,当车辆位于一级范围对应的区域内时,安全等级为1级,此时数据处理装置可以根据路侧设备的感知结果或者通信数据来应对驾驶场景,可以无需驾驶员。其他安全等级可以参见表述。
表7车辆的安全等级
(表7的具体内容在原文中以图像形式给出)
设计2,数据处理装置根据覆盖能力确定车辆的驾驶策略。其中,驾驶策略可以包含安全等级、感知结果的置信度、是否需要驾驶员接管车辆、是否启动自动驾驶(或辅助驾驶)等等中的一项或者多项。例如,覆盖能力对应有第一覆盖区域,车辆可以根据是否在第一覆盖区域内从而确定当前路侧设备的覆盖能力,进而对驾驶策略进行调整。
例如,响应于所述第一车辆位于所述第一覆盖区域内,将所述第一车辆的安全等级确定为高安全等级。或者,响应于所述第一车辆位于所述第一覆盖区域内,提高所述路侧设备的感知结果的置信度。或者,响应于所述第一车辆位于所述第一覆盖区域内,触发第一提醒消息,所述第一提醒消息用于提醒用户开启所述第一车辆的自动驾驶功能或者开启所述第一车辆的辅助驾驶功能。或者,响应于所述第一车辆离开所述第一覆盖区域内,触发第二提醒消息,所述第二提醒消息用于提醒用户接管所述第一车辆。
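下面是一个简化的草图(第一覆盖区域以矩形近似,坐标与消息内容均为假设),演示车辆根据自身是否位于第一覆盖区域内来调整驾驶策略的判断逻辑:

```python
# 简化草图:根据车辆是否位于第一覆盖区域内调整驾驶策略(区域用矩形近似,数值为假设)
FIRST_COVERAGE_AREA = {"x_min": 0.0, "x_max": 80.0, "y_min": -6.0, "y_max": 6.0}

def in_area(pos, area) -> bool:
    x, y = pos
    return area["x_min"] <= x <= area["x_max"] and area["y_min"] <= y <= area["y_max"]

def driving_strategy(vehicle_pos, was_in_area: bool) -> list:
    actions = []
    inside = in_area(vehicle_pos, FIRST_COVERAGE_AREA)
    if inside:
        actions.append("将安全等级确定为高安全等级")
        actions.append("提高路侧设备感知结果的置信度")
        actions.append("触发第一提醒消息:提示可开启自动驾驶/辅助驾驶功能")
    elif was_in_area:
        actions.append("触发第二提醒消息:提醒用户接管车辆")
    return actions

print(driving_strategy((30.0, 1.5), was_in_area=False))    # 位于第一覆盖区域内
print(driving_strategy((120.0, 1.5), was_in_area=True))    # 离开第一覆盖区域
```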
本申请例举一种可能的设计,请参见图7,以数据处理装置包含于车辆702中为例,当车辆702位于正确率为90%的感知覆盖区域内,此时可以提高车辆的安全等级。类似的,当车辆702位于正确率为90%的感知覆盖区域内,可以提高该路侧设备的感知结果的置信度。类似的,当车辆位于正确率为90%的感知覆盖区域内,可以触发第一提醒消息,该第一提醒消息用于提醒用户开启所述第一车辆的自动驾驶功能或者开启所述第一车辆的辅助驾驶功能。
可选的,当车辆驶出该正确率为90%的感知覆盖区域时,可以触发第二提醒消息,所述第二提醒消息用于提醒用户接管所述第一车辆。可以看出,车辆位于正确率为90%的感知覆盖区域时,此时路侧设备对于车辆702以及车辆702周围的环境的感知结果较为准确,而此时提高感知结果的置信度,可以使得根据感知结果可以确定出更可靠的驾驶操作,提高安全性。
在一种可能的设计中,数据处理装置可以根据覆盖信息得到盲区,从而控制车辆的动作。例如,参见图9,图9是本申请实施例提供的一种可能的盲区范围的示意图,当车辆位于通信设备902的通信盲区,可以主动切断与通信设备902的通信连接,避免不稳定的连接占用车辆的通信和处理资源;当车辆位于感知设备901的感知盲区或者车辆所需要的探测结果位于感知设备901的感知盲区,可以降低感知设备901的感知结果的置信度或者不使用来自感知设备901的感知结果。
本申请实施例中对覆盖信息的内容进行了设计,满足了对路侧设备的覆盖范围的使用需求。后续在车辆或者服务提供商使用路侧设备提供的服务时,可以根据覆盖信息确定路侧设备的覆盖区域和在该区域内的覆盖能力,得到路侧设备所提供的服务的可靠性、鲁棒性等等。例如通过覆盖信息,更精确地得到路侧设备在某一区域内的感知结果的置信度、或在某一区域内与路侧设备之间通信连接的鲁棒性等指标,提升了自动驾驶或者辅助驾驶的可靠性。
请参阅图13,图13是本申请实施例提供的又一种数据处理方法的流程示意图。可选的,图13所示的数据处理方法可适用于上述图1中所示的场景。该数据处理方法至少可以包括如下步骤:
步骤S1301:第一数据处理装置生成路侧设备的覆盖信息。
其中,第一数据处理装置可以为终端设备(例如路侧设备、或车辆等)也可以是网络侧设备(例如服务器、或云端等)。
可选的,用于生成路侧设备的覆盖信息的参数也可以是路侧设备上报的,也可以是第一数据处理装置自己采集的,还可以是第一数据处理装置根据路侧设备的感知结果、通信结果等计算得到的。
覆盖信息,也可以称为覆盖数据,包括路侧设备的覆盖区域、路侧设备在所述覆盖区域内的覆盖能力信息等。其中,路侧设备的覆盖区域在路侧设备的覆盖范围内。覆盖能力具体为路侧设备在覆盖区域内的覆盖能力,覆盖能力可以通过覆盖能力信息来描述。关于覆盖信息、覆盖区域、覆盖能力的详细描述可以参考步骤S401中的相关说明,此处不在赘述。
覆盖能力具体可以指示不同的指标,或者称为内容。在一种可能的设计,覆盖能力为路侧设备在通信覆盖区域内的覆盖能力,覆盖能力信息可以用于指示以下内容(或者说指标)中的至少一项:数据正确率、丢包率、通信时延、通信稳定性或信号强度等等中的至少一项。在又一种可能的设计,覆盖能力为路侧设备在感知覆盖区域内的覆盖能力,所述覆盖能力信息用于指示以下内容(或者说指标)中的至少一项:感知结果正确率、误检率、漏检率、召回率、感知精度、感知平均精度、检测稳定性或检测位置精度等等中的至少一项。关于内容(指标)的详细描述可以参考步骤S401中的相关说明,此处不在赘述。
覆盖信息中包括的覆盖区域的数量可以是一个或者多个。相应的,覆盖能力信息也可以是一个或者多个。可选的,包含多个感知覆盖区域时,多个感知覆盖区域可以是按照感知能力的等级划分的。类似的,包含多个通信覆盖区域时,多个通信覆盖区域可以是按照通信能力的等级划分的。
可选的,在覆盖信息包含多个覆盖区域时,多个覆盖区域之间可以有重叠区域。
可选的,路侧设备可以包含一个或者多个感知设备,或者可以与一个或多个感知设备连接。路侧设备的感知能力具体可以通过感知设备实现。进一步可选的,感知设备可以进行组合,一个或者多个感知设备可以组成感知设备组。
进一步可选的,在路侧设备包含多个感知设备(或者与多个感知设备连接)的情况下,覆盖信息中的感知覆盖区域可以对应感知设备或者感知设备组。在一种可能的设计中,对应感知设备组的感知覆盖区域和感知覆盖区域内的覆盖能力是根据感知设备组内的感知设备的覆盖能力确定的。例如,覆盖信息中的覆盖区域可以是融合多个单个设备的覆盖区域得到的。
可选的,覆盖信息中还可以包含盲区的信息,其中盲区可以包含通信盲区、感知盲区等等中的至少一个。
可选的,所述路侧设备在所述至少一个覆盖区域内的覆盖能力信息指示多种环境下的多种能力。
可选的,覆盖信息中还可以包含路侧设备的标识、瓦片ID等等中的一项或者多项。
步骤S1301中的相关概念的详细描述可以参考步骤S401中的相关说明,此处不在赘述。
步骤S1302:第一数据处理装置发送覆盖信息。
具体的,第一数据处理装置可以通过有线、无线链路或者有线无线组合链路的方式与其他装置通信,从而可以向其他装置发送覆盖信息。可选的,该第一数据处理装置与其他装置之间收发信息的数据链路可以包括各种类型的连接介质,具体可以是有线链路(如光纤等)、或者无线链路、或者有线链路与无线链路的组合等等。例如可以为包括802.11b/g、蓝牙(Blue Tooth)、紫蜂(Zigbee)、车载短距无线通信技术、全球移动通信系统(Global System for Mobile communications,GSM)、通用分组无线业务(General Packet Radio Service,GPRS)、通用移动通信系统(Universal Mobile Telecommunications System,UMTS)、超宽带(Ultra Wideband,UWB)技术、车载无线传输技术等。当然,不排除还有其他技术可以用于支持该第一数据处理装置与其他装置之间进行通信。
步骤S1303:第二数据处理装置获取路侧设备的覆盖信息。
可理解的,第一数据处理装置可以向第二数据处理装置发送路侧设备的覆盖信息。相应的,第二数据处理装置接收来自第一数据处理装置的路侧设备的覆盖信息。
第二数据处理装置可以为终端设备(例如路侧设备、或车辆等)也可以是网络侧设备(例如服务器、或云端等)。
例如,请参见图7,第一数据处理装置可以为路侧设备701,第二数据处理装置可以为车辆702(或云端703)。路侧装置可以根据自身的覆盖能力生成覆盖信息,还可以将覆盖信息发送给车辆702(或云端703)。相应的,车辆702(或云端703)则获取了覆盖信息。
再如,第一数据处理装置可以为云端703,第二数据处理装置可以为车辆702(路侧设备701)。第一数据处理装置可以向车辆702(路侧设备701)发送覆盖信息,相应的,车辆702(路侧设备701)可以接收云端703发送的覆盖信息。可选的,该覆盖信息可以包含路侧设备701的覆盖信息,可选还包含其他路侧设备(图7中未示出)的覆盖信息。
步骤S1304:第二数据处理装置利用覆盖信息,更新地图或者控制车辆的动作,即生成用于控制车辆的控制信号。
具体可以参考步骤S402中的相关描述,此处不在赘述。
本申请实施例还提供一种覆盖数据,所述覆盖数据用于描述路侧设备的覆盖范围。具体 的,所述覆盖数据包含覆盖区域、路侧设备在覆盖区域内的覆盖能力等等。其中,路侧设备的覆盖区域在路侧设备的覆盖范围内,可以包括感知覆盖区域、或通信覆盖区域等。覆盖能力具体为路侧设备在覆盖区域内的覆盖能力,覆盖能力可以通过覆盖能力信息来描述。覆盖能力具体可以指示不同的指标,或者称为内容。
在一种可能的设计中,覆盖能力为路侧设备在通信覆盖区域内的覆盖能力,覆盖能力信息可以用于指示以下内容(或者说指标)中的至少一项:数据正确率、丢包率、通信时延、通信稳定性或信号强度等等中的至少一项。
在又一种可能的设计中,覆盖能力为路侧设备在感知覆盖区域内的覆盖能力,所述覆盖能力信息用于指示以下内容(或者说指标)中的至少一项:感知结果正确率、误检率、漏检率、召回率、感知精度、感知平均精度(Average Precision,AP)、检测稳定性或检测位置精度等等中的至少一项。
覆盖信息中包括的覆盖区域的数量可以是一个或者多个。相应的,覆盖能力信息也可以是一个或者多个。可选的,包含多个感知覆盖区域时,多个感知覆盖区域可以是按照感知能力的等级划分的。类似的,包含多个通信覆盖区域时,多个通信覆盖区域可以是按照通信能力的等级划分的。
进一步可选的,在路侧设备包含多个感知设备(或者与多个感知设备连接)的情况下,覆盖信息中的感知覆盖区域可以对应感知设备或者感知设备组。在一种可能的设计中,对应感知设备组的感知覆盖区域和感知覆盖区域内的覆盖能力是根据感知设备组内的感知设备的覆盖能力确定的。例如,覆盖信息中的覆盖区域可以是融合多个单个设备的覆盖区域得到的。
可选的,覆盖信息中还可以包含路侧设备的标识、瓦片ID、盲区的信息、路侧设备的ID、等中的一项或者多项,其中盲区可以包含通信盲区、感知盲区等等中的至少一个。
可选的,覆盖数据可以分为多个层级进行表征。
一种可能的通信覆盖数据中,第一层级为路侧设备ID,路侧设备ID的下一层级(便于描述称为第二层级)包含多个范围等级,每一个范围等级的下一层级(便于描述称为第三层级)包含覆盖能力信息和覆盖区域(或者说覆盖区域的指示信息)。示例性地,通信覆盖数据的结构可以如图11所示。
一种可能的感知覆盖数据中,其中第一层级为路侧设备ID,路侧设备ID的下一层级(便于描述称为第二层级)包含感知设备ID或感知设备组ID。感知设备ID(或感知设备组ID)的下一层级(便于描述称为第三层级)包含多个范围等级。范围等级的下一层级(便于描述称为第四层级)包含覆盖能力信息和覆盖区域(或者说覆盖区域的指示信息)。示例性地,通信覆盖数据的结构可以如图12所示。
上述详细阐述了本申请实施例的方法和覆盖数据,下面提供了本申请实施例的装置。
请参见图14,图14是本申请实施例提供的一种数据处理装置140(以下称为装置140)的结构示意图,该装置140可以为独立设备,也可以为独立设备中的一个器件,例如芯片或者集成电路等。
在一种可能的设计中,该装置140可以为图4所示实施例中的数据处理装置,或者为数据处理装置中的一个器件,例如芯片或者集成电路。
在又一种可能的设计中,该装置140可以为图13所示实施例中的第二数据处理装置,或者为第二数据处理装置中的一个器件,例如芯片或者集成电路。
装置140包括获取单元1401和存储单元1402。
其中,获取单元1401,用于获取路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息。
其中,存储单元1402,用于将所述覆盖信息存储为地图数据。
关于其中覆盖信息、覆盖区域、覆盖区域信息、覆盖能力和覆盖能力信息,请参见上文描述,此处不再赘述。
可以理解的,本申请各个装置实施例中,对多个单元或者模块的划分仅是一种根据功能进行的逻辑划分,不作为对装置具体的结构的限定。在具体实现中,其中部分功能模块可能被细分为更多细小的功能模块,部分功能模块也可能组合成一个功能模块,但无论这些功能模块是进行了细分还是组合,装置140在数据处理过程中所执行的大致流程是相同的。通常,每个单元都对应有各自的程序代码(或者说程序指令),这些单元各自对应的程序代码在处理器上运行时,使得该单元受处理器的控制而执行相应的流程从而实现相应功能。
请参见图15,图15是本申请实施例提供的一种数据处理装置150(以下称为装置150)的结构示意图,该装置150可以为独立设备,也可以为独立设备中的一个器件,例如芯片或者集成电路等。
装置150包括处理单元1501、存储单元1502、通信单元1503和显示单元1504。
在一种情况下,处理单元1501用于生成路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;存储单元1502用于将处理单元1501生成的覆盖信息存储为地图数据。
在另一种情况下通信单元1503用于接收路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;存储单元1502用于将通信单元1503接收的覆盖信息存储为地图数据。
关于其中覆盖信息、覆盖区域、覆盖区域信息、覆盖能力和覆盖能力信息,请参见上文描述,此处不再赘述。
显示单元1504作为装置150内的一个可选单元,用于将上述覆盖信息在显示界面上进行显示。
可选地,通信单元1503用于发送所述覆盖信息。
可选地,处理单元1501还用于利用所述覆盖信息,生成用于控制所述车辆的控制信号。
可选地,处理单元1501还用于利用所述覆盖信息,进行信息处理,例如确定感知信息的置信度或者确定车辆所述的安全等级。
请参见图16,图16是本申请实施例提供的一种数据处理装置160的结构示意图,该装置160可以为独立设备(例如节点、终端等等中的一个),也可以为独立设备中的一个器件,例如芯片或者集成电路等。该装置160可以包括至少一个处理器1601和通信接口1602。进一步可选的,所述装置160还可以包括至少一个存储器1603。更进一步可选的,还可以包含总线1604,其中,处理器1601、通信接口1602和存储器1603通过总线1604相连。
其中,处理器1601是进行算术运算和/或逻辑运算的模块,具体可以是中央处理器(central processing unit,CPU)、图片处理器(graphics processing unit,GPU)、微处理器(microprocessor  unit,MPU)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程逻辑门阵列(Field Programmable Gate Array,FPGA)、复杂可编程逻辑器件(Complex programmable logic device,CPLD)、协处理器(协助中央处理器完成相应处理和应用)、微控制单元(Microcontroller Unit,MCU)等处理模块中的一种或者多种的组合。
通信接口1602可以用于为所述至少一个处理器提供信息输入或者输出。和/或,所述通信接口1602可以用于接收外部发送的数据和/或向外部发送数据,可以为包括诸如以太网电缆等的有线链路接口,也可以是无线链路(Wi-Fi、蓝牙、通用无线传输、车载短距通信技术以及其他短距无线通信技术等)接口。可选的,通信接口1602还可以包括与接口耦合的发射器(如射频发射器、天线等),或者接收器等。
例如,通信接口1602还可以包括天线。电磁波经过天线被接收,通信接口1602还可以将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器1601。再如,通信接口1602还可以从处理器1601接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
存储器1603用于提供存储空间,存储空间中可以存储操作系统和计算机程序等数据。存储器1603可以是随机存储记忆体(random access memory,RAM)、只读存储器(read-only memory,ROM)、可擦除可编程只读存储器(erasable programmable read only memory,EPROM)、或便携式只读存储器(compact disc read-only memory,CD-ROM)等等中的一种或者多种的组合。
该装置160中的至少一个处理器1601用于调用至少一个存储器1603中存储的计算机程序,用于执行前述的方法,例如图4、图13所示实施例所描述的方法。
在一种可能的设计中,该装置160可以为图4所示实施例中的数据处理装置,或者为数据处理装置中的一个器件,例如芯片或者集成电路。
在又一种可能的设计中,该装置160可以为图13所示实施例中的第二数据处理装置,或者为第二数据处理装置中的一个器件,例如芯片或者集成电路。
本申请实施例还提供了一种终端,所述终端用于实现图4或者图13所示的实施例所描述的方法。所述终端包括但不限于车辆或者便携终端。
在一种设计中,所述终端包含前述的装置,例如图14、图15或者图16所示的装置。
本申请实施例还提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,当所述计算机程序在一个或多个处理器上运行时,实现图4或图13所示的实施例所描述的方法。
本申请实施例还提供了一种计算机程序产品,当所述计算机程序产品在一个或多个处理器上运行时,实现图4所示的实施例所述的方法。
本申请实施例还提供了一种芯片系统,所述芯片系统包括通信接口和至少一个处理器,该通信接口用于为上述至少一个处理器提供信息输入/输出,和/或,所述通信接口用于为发送或者接收数据。所述处理器用于调用计算机程序(或者计算机指令),以实现图4或图13所示的实施例所述的方法。
需要说明的是,本申请中存储器中的计算机程序可以预先存储也可以使用该设备时从互联网下载后存储,本申请对于存储器中计算机程序的来源不进行具体限定。本申请实施例中的耦合是装置、单元或模块之间的间接耦合或连接,其可以是电性,机械或其它的形式,用于装置、单元或模块之间的信息交互。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当 使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (31)

  1. 一种数据处理方法,其特征在于,包括:
    获取路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;
    将所述覆盖信息存储为地图数据。
  2. 根据权利要求1所述的方法,其特征在于,所述至少一个覆盖区域根据所述至少一个覆盖区域的覆盖能力按等级划分。
  3. 根据权利要求1或2所述的方法,其特征在于,所述至少一个覆盖区域包括M个通信覆盖区域和N个感知覆盖区域,其中,所述M和所述N为自然数,且所述M和所述N不同时为0。
  4. 根据权利要求3所述的方法,其特征在于,所述N个感知覆盖区域包括多设备感知覆盖区域,所述多设备感知覆盖区域和所述路侧设备在所述多设备感知覆盖区域内的覆盖能力是根据与所述路侧设备相关的多个感知设备的覆盖能力确定的。
  5. 根据权利要求3所述的方法,其特征在于,
    所述路侧设备与第一感知设备和第二感知设备相关,所述N个感知覆盖区域包括所述第一感知设备的第一覆盖区域和所述第二感知设备的第二覆盖区域,所述覆盖能力信息包括用于指示所述第一感知设备在所述第一覆盖区域内的覆盖能力的第一覆盖能力信息和用于指示所述第二感知设备在所述第二覆盖区域内的覆盖能力的第二覆盖能力信息。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述覆盖信息还包括用于指示盲区的信息,所述盲区包括通信盲区,感知盲区,或通信盲区和感知盲区。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,当所述覆盖能力为所述路侧设备在通信覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
    数据正确率、丢包率、通信时延、通信稳定性和信号强度。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,当所述覆盖能力为所述路侧设备在感知覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
    感知结果正确率、误检率、漏检率、召回率、感知精度、检测稳定性和检测位置精度。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述覆盖能力信息指示多种环境下的所述覆盖能力。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述覆盖信息还包括所述路侧设备的标识。
  11. 根据权利要求1-10任一项所述的方法,其特征在于,所述覆盖信息还包括瓦片标识。
  12. 根据权利要求1-11任一项所述的方法,其特征在于,所述方法还包括:
    将所述覆盖信息在显示界面上进行显示。
  13. 根据权利要求1-12任一项所述的方法,其特征在于,所述方法还包括:
    发送所述覆盖信息。
  14. 根据权利要求1-13任一项所述的方法,其特征在于,所述方法还包括:
    利用所述覆盖信息,进行信息处理或者生成用于控制车辆的控制信号。
  15. 一种数据处理装置,其特征在于,包括:
    获取单元,用于获取路侧设备的覆盖信息,所述覆盖信息包括用于指示所述路侧设备的至少一个覆盖区域的覆盖区域信息以及用于指示所述路侧设备在所述至少一个覆盖区域内的覆盖能力的覆盖能力信息;
    存储单元,用于将所述覆盖信息存储为地图数据。
  16. 根据权利要求15所述的装置,其特征在于,所述至少一个覆盖区域根据所述至少一个覆盖区域的覆盖能力按等级划分。
  17. 根据权利要求15或16所述的装置,其特征在于,所述至少一个覆盖区域包括M个通信覆盖区域和N个感知覆盖区域,其中,所述M和所述N为自然数,且所述M和所述N不同时为0。
  18. 根据权利要求17所述的装置,其特征在于,所述N个感知覆盖区域包括多设备感知覆盖区域,所述多设备感知覆盖区域和所述路侧设备在所述多设备感知覆盖区域内的覆盖能力是根据与所述路侧设备相关的多个感知设备的覆盖能力确定的。
  19. 根据权利要求17所述的装置,其特征在于,
    所述路侧设备与第一感知设备和第二感知设备相关,所述N个感知覆盖区域包括所述第一感知设备的第一覆盖区域和所述第二感知设备的第二覆盖区域,所述覆盖能力信息包括用于指示所述第一感知设备在所述第一覆盖区域内的覆盖能力的第一覆盖能力信息和用于指示所述第二感知设备在所述第二覆盖区域内的覆盖能力的第二覆盖能力信息。
  20. 根据权利要求15-19任一项所述的装置,其特征在于,所述覆盖信息还包括用于指示盲区的信息,所述盲区包括通信盲区,感知盲区,或通信盲区和感知盲区。
  21. 根据权利要求15-20任一项所述的装置,其特征在于,当所述覆盖能力为所述路侧设备在通信覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
    数据正确率、丢包率、通信时延、通信稳定性和信号强度。
  22. 根据权利要求15-21任一项所述的装置,其特征在于,当所述覆盖能力为所述路侧设备在感知覆盖区域内的覆盖能力时,所述覆盖能力信息用于指示以下内容中的至少一项:
    感知结果正确率、误检率、漏检率、召回率、感知精度、检测稳定性和检测位置精度。
  23. 根据权利要求15-22任一项所述的装置,其特征在于,所述覆盖能力信息指示多种环境下的所述覆盖能力。
  24. 根据权利要求15-23任一项所述的装置,其特征在于,所述覆盖信息还包括所述路侧设备的标识。
  25. 根据权利要求15-24任一项所述的装置,其特征在于,所述覆盖信息还包括瓦片标识。
  26. 根据权利要求15-25任一项所述的装置,其特征在于,所述装置还包括:
    显示单元,用于将所述覆盖信息在显示界面上进行显示。
  27. 根据权利要求15-26任一项所述的装置,其特征在于,所述装置还包括:
    通信单元,用于发送所述覆盖信息。
  28. 根据权利要求15-27任一项所述的装置,其特征在于,所述装置还包括:
    处理单元,用于利用所述覆盖信息,进行信息处理或者生成用于控制车辆的控制信号。
  29. 一种数据处理装置,其特征在于,包括处理器和通信接口,
    所述通信接口用于接收计算机执行指令并传输至所述处理器;
    所述处理器用于执行所述计算机执行指令,以使所述数据处理装置执行如权利要求1-14中任一项所述的方法。
  30. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行以实现如权利要求1-14任一项所述的方法。
  31. 一种车辆,其特征在于,所述车辆包括如权利要求15-29任一项所述的数据处理装置。
PCT/CN2022/113648 2021-08-27 2022-08-19 数据处理方法及装置 WO2023025061A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22860407.0A EP4379570A1 (en) 2021-08-27 2022-08-19 Data processing method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110996319.4 2021-08-27
CN202110996319.4A CN115905431A (zh) 2021-08-27 2021-08-27 数据处理方法及装置

Publications (1)

Publication Number Publication Date
WO2023025061A1 true WO2023025061A1 (zh) 2023-03-02

Family

ID=85321517

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/113648 WO2023025061A1 (zh) 2021-08-27 2022-08-19 数据处理方法及装置

Country Status (3)

Country Link
EP (1) EP4379570A1 (zh)
CN (1) CN115905431A (zh)
WO (1) WO2023025061A1 (zh)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108605205A (zh) * 2016-01-29 2018-09-28 三星电子株式会社 用于确定电子装置的位置的设备和方法
CN111198890A (zh) * 2018-11-20 2020-05-26 北京图森智途科技有限公司 地图更新方法、路侧设备、车载装置、车辆和系统
US20190132709A1 (en) * 2018-12-27 2019-05-02 Ralf Graefe Sensor network enhancement mechanisms
WO2020253613A1 (zh) * 2019-06-17 2020-12-24 华为技术有限公司 通信的方法和通信装置
CN112800156A (zh) * 2021-01-06 2021-05-14 迪爱斯信息技术股份有限公司 一种基于路侧单元地图分幅方法、系统、设备和存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Application layer support for Vehicle-to-Everything (V2X) services; Functional architecture and information flows; (Release 17)", 3GPP STANDARD; TECHNICAL SPECIFICATION; 3GPP TS 23.286, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, no. V17.2.0, 24 June 2021 (2021-06-24), Mobile Competence Centre ; 650, route des Lucioles ; F-06921 Sophia-Antipolis Cedex ; France , pages 1 - 105, XP052029590 *

Also Published As

Publication number Publication date
EP4379570A1 (en) 2024-06-05
CN115905431A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
US11593344B2 (en) Updating high definition maps based on age of maps
US11988518B2 (en) Updating high definition maps based on lane closure and lane opening
Zhang et al. Virtual traffic lights: System design and implementation
CN104732782A (zh) 发明智能在线式红绿灯及其智慧交通系统和方法
CN109003467A (zh) 一种防止车辆碰撞的方法、装置及系统
CN111216731B (zh) 一种车路协同自动驾驶的主动感知系统
CN102968909A (zh) 一种道路车辆拥堵远程智能识别系统及方法
CN112396856A (zh) 一种路况信息获取方法、交通标识牌及智能网联交通系统
WO2023040712A1 (zh) 地图数据处理方法和装置
WO2023155218A1 (en) A system and a method for reducing false alerts in a road management system
WO2024114414A1 (zh) 一种v2x事件的推送方法及装置
WO2023025061A1 (zh) 数据处理方法及装置
CN113044062B (zh) 一种智慧路灯无人驾驶控制方法、装置、设备及存储介质
CN113932828A (zh) 导航方法、终端、服务器、电子设备及存储介质
WO2023005636A1 (zh) 感知能力信息生成方法、使用方法及装置
US20240169831A1 (en) Method and device for providing traffic information
US20240011785A1 (en) Navigation Method, Apparatus, and System, and Map
WO2023137727A1 (zh) 智能驾驶功能或系统的控制方法及装置
WO2023024722A1 (zh) 通信能力信息生成方法、使用方法及装置
US20230316907A1 (en) Intersection-based offboard vehicle path generation
US20230316921A1 (en) Collision warning based on intersection information from map messages
US20230316912A1 (en) Path prediction based on intersection information from map messages
US20230316911A1 (en) Intersection-based map message generation and broadcasting
US20240029558A1 (en) Obstructed Lane Detection And Warning Method
US20240025435A1 (en) V2-based roll-over alert in an intersection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22860407; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 2022860407; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2022860407; Country of ref document: EP; Effective date: 20240229)
NENP Non-entry into the national phase (Ref country code: DE)