WO2022082230A2 - Methods and apparatus for supporting autonomous vehicles in multi-edge computing systems - Google Patents


Info

Publication number
WO2022082230A2
Authority
WO
WIPO (PCT)
Prior art keywords
sensor data
sensor
data
node
traffic
Prior art date
Application number
PCT/US2022/017053
Other languages
French (fr)
Other versions
WO2022082230A9 (en)
WO2022082230A3 (en)
Inventor
Jiafeng ZHU
Zongfang LIN
Original Assignee
Futurewei Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futurewei Technologies, Inc.
Priority to PCT/US2022/017053
Publication of WO2022082230A2
Publication of WO2022082230A9
Publication of WO2022082230A3

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G08G1/0125 Traffic data processing
    • G08G1/0129 Traffic data processing for creating historical data or processing based on historical data
    • G08G1/0137 Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G1/0141 Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G1/096733 Systems involving transmission of highway information where a selection of the information might take place
    • G08G1/096741 Systems involving transmission of highway information where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G1/096766 Systems involving transmission of highway information where the system is characterised by the origin of the information transmission
    • G08G1/096775 Systems involving transmission of highway information where the origin of the information is a central station
    • G08G1/096783 Systems involving transmission of highway information where the origin of the information is a roadside individual element
    • G08G1/096791 Systems involving transmission of highway information where the origin of the information is another vehicle

Definitions

  • the present disclosure relates generally to methods and apparatus for digital computing and communications, and, in particular embodiments, to methods and apparatus for supporting autonomous vehicles (AVs) in multi-edge computing (MEC) systems.
  • AVs autonomous vehicles
  • MEC multi-edge computing
  • Multi-access edge computing generally offers a new computing model that extends the capability of what is ordinarily available in single devices, thereby enabling resource-limited devices to not only perform tasks ordinarily outside of their capabilities, but also realize new tasks while keeping up with ever increasing demands.
  • autonomous vehicles support a wide range of applications, such as applications for tracking, autonomous driving, safety, navigation, machine learning, machine vision, voice recognition, gesture recognition, and so forth.
  • for these applications to operate at their fullest potential, accurate and timely sensor data from sensors located on the AVs is needed.
  • the sensors may fail (for example, a sensor may be damaged when the AV bumps into an object, or a sensor may be rendered inoperative by weather conditions or by road dirt accumulating on the sensor, and so on). Inaccurate or untimely sensor data may prevent some of the applications executing in the AV from operating properly, or even more dangerously, the applications may execute in an erroneous fashion. Therefore, there is a need for methods and apparatus for supporting AVs in MEC systems.
  • a method implemented by a first multi-access edge computing (MEC) node includes: receiving mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receiving traffic data and weather data of the geographical area; generating real-time road, traffic and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generating a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographic area in accordance with the real-time road, traffic, and weather models; and sharing, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
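The claimed pipeline — ingest mobile and fixed sensor data plus traffic and weather data, build models of the area, fuse them into a map, and share only the portion near a given AV's geolocation — can be sketched roughly as follows. All class and field names (`MecNode`, `SensorReading`), the tile size, and the sharing radius are illustrative assumptions, not taken from the specification, and a 2D tile grid stands in for the full 3D map:

```python
import math
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    sensor_id: str
    kind: str            # "mobile" (on an AV) or "fixed" (roadside)
    position: tuple      # (x, y) within the geographical area
    value: float

@dataclass
class MecNode:
    area_id: str
    readings: list = field(default_factory=list)
    traffic: dict = field(default_factory=dict)
    weather: dict = field(default_factory=dict)
    map_tiles: dict = field(default_factory=dict)  # (ix, iy) -> tile summary

    def ingest(self, readings, traffic, weather):
        """Receive mobile/fixed sensor data plus traffic and weather data."""
        self.readings.extend(readings)
        self.traffic.update(traffic)
        self.weather.update(weather)

    def build_map(self, tile_size=100.0):
        """Fuse road, traffic, and weather inputs into per-tile map entries."""
        self.map_tiles.clear()
        for r in self.readings:
            ix = int(r.position[0] // tile_size)
            iy = int(r.position[1] // tile_size)
            tile = self.map_tiles.setdefault(
                (ix, iy),
                {"readings": 0, "traffic": self.traffic, "weather": self.weather})
            tile["readings"] += 1

    def share_for_av(self, av_position, radius=150.0, tile_size=100.0):
        """Share only the map tiles within `radius` of the AV's geolocation."""
        shared = {}
        for (ix, iy), tile in self.map_tiles.items():
            cx, cy = (ix + 0.5) * tile_size, (iy + 0.5) * tile_size
            if math.hypot(cx - av_position[0], cy - av_position[1]) <= radius:
                shared[(ix, iy)] = tile
        return shared
```

The same skeleton also covers the update case: re-running `build_map` after new readings arrive and calling `share_for_av` again corresponds to updating and re-sharing the map.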
  • MEC multi-access edge computing
  • the mobile sensor data being received from the at least one AV.
  • the mobile sensor data being received from a second AV within the geographic area.
  • the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
  • the traffic data and the weather data being received from at least one of sensors or information services.
  • the method further comprises sharing, with the at least one AV, the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
  • the sharing the real-time road, traffic, and weather models comprising sharing a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
  • the method further comprises sharing, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
  • the method further comprises sharing, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
  • the sharing the 3D map comprising sharing a portion of the 3D map in accordance with the geolocation information associated with the at least one AV.
  • the method further comprises: receiving an update of at least one of: the fixed sensor data, the mobile sensor data, the traffic data, or the weather data; updating the 3D map in accordance with the update; and sharing, with the at least one AV, the updated 3D map in accordance with the geolocation information associated with the at least one AV.
  • the method further comprises determining a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors, and based thereon, replacing sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
  • the method further comprises receiving a request from a first AV connected with the MEC node, requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.
  • the method further comprises determining a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
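The determination described above — choosing a fixed or mobile sensor whose data can stand in for a faulty sensor's — might be implemented as a nearest-neighbor search constrained to the same sensor type and a specified distance, as the earlier claim requires. A minimal sketch, with assumed field names:

```python
import math

def select_replacement(faulty, candidates, max_distance=50.0):
    """Return the nearest working same-type sensor within max_distance, or None.

    Each sensor is a dict: {"id", "type", "position": (x, y), "ok": bool}.
    The distance threshold of 50.0 is an illustrative value.
    """
    best, best_d = None, max_distance
    for s in candidates:
        if s["id"] == faulty["id"] or not s["ok"] or s["type"] != faulty["type"]:
            continue  # skip the faulty sensor itself, broken sensors, other types
        d = math.hypot(s["position"][0] - faulty["position"][0],
                       s["position"][1] - faulty["position"][1])
        if d <= best_d:
            best, best_d = s, d
    return best
```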
  • a multi-access edge computing (MEC) node includes: a non-transitory memory storing instructions; and at least one processor in communication with the memory, the at least one processor configured, upon execution of the instructions, to: receive mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receive traffic data and weather data of the geographical area; generate real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic, and weather models; and share, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
  • AV autonomous vehicle
  • the mobile sensor data being received from the at least one AV.
  • the mobile sensor data being received from a second AV within the geographic area.
  • the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
  • the traffic data and the weather data being received from at least one of sensors or information services.
  • the instructions causing the MEC node to share, with the at least one AV, the real-time road, traffic and weather models in accordance with the geolocation information associated with the at least one AV.
  • the instructions causing the MEC node to share a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
  • the instructions further causing the MEC node to share, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
  • the instructions further causing the MEC node to share, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
  • the 3D map comprising information of the at least one AV.
  • the instructions causing the MEC node to: determine a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors; and based thereon, replace sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
  • the instructions further causing the MEC node to receive a request from a first AV connected with the MEC node, the request requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.
  • the instructions further causing the MEC node to determine a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
  • a first multi-access edge computing (MEC) node includes: a messaging agent configured to communicate messages with a second MEC node, configured to communicate messages with at least one autonomous vehicle (AV) operating within coverage of the first MEC node, configured to receive sensor data from sensors in a geographical area, and configured to receive data from information services; a modeling service operatively coupled to the messaging agent, the modeling service configured to generate real-time road, traffic, and weather models of the geographical area in accordance with the sensor data and the data received from the information services; and a fusion service operatively coupled to the modeling service, the fusion service configured to generate a three- dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic, and weather models.
  • AV autonomous vehicle
  • the fusion service configured to validate sensor data received from mobile sensors and fixed sensors in the geographical area, track AVs operating within the coverage of the first MEC node, and maintain a connection state with the at least one AV.
  • the messaging agent configured to receive updated sensor data or updated data from the information services.
  • the modeling service configured to update the real-time road, traffic, or weather models in accordance with the updated sensor data or the updated data from the information services.
  • the fusion service configured to update the 3D map in accordance with the updated real-time road, traffic, or weather models.
  • the messaging agent configured to share the updated 3D map in accordance with geolocation information associated with the at least one AV.
  • the messaging agent configured to share, with the second MEC node, the updated 3D map in accordance with geolocation information associated with the second MEC node.
  • the messaging agent configured to share, with the second MEC node, the updated real-time road, traffic, or weather models in accordance with geolocation information associated with the second MEC node.
  • An advantage of the disclosed embodiments is that a multi-edge computing (MEC) system uses data from a distributed system of sensors and information services to provide accurate and timely information to autonomous vehicles (AVs), enabling the proper and safe execution of applications by the AVs.
  • MEC multi-edge computing
  • Figure 1 is a diagram of an autonomous vehicle (AV) highlighting different types of sensors deployed on an AV;
  • AV autonomous vehicle
  • Figure 2 is a diagram of the AV highlighting a faulty radar sensor
  • Figure 3 is a diagram of a portion of a smart road
  • Figure 4 is a diagram of a system highlighting a resource pool used to provide support for the AV operation according to example embodiments presented herein;
  • Figure 5A is a diagram of the AV and dynamic traffic road management (DTRM) nodes providing computational resources or data processing to the AV at a first time instance according to example embodiments presented herein;
  • DTRM dynamic traffic road management
  • Figure 5B is a diagram of the AV and the DTRM nodes providing computational resources or data processing to the AV at a second time instance according to example embodiments presented herein;
  • Figure 6 is a diagram of a smart road deployment with a DTRM system providing AV operation support according to example embodiments presented herein;
  • Figure 7 is a diagram of a portion of a DTRM system deploying an autonomous driving support system, highlighting a DTRM node, services implemented at the DTRM node for supporting AVs, and AVs according to example embodiments presented herein;
  • Figure 8 is a diagram of communication between and operations performed by participants involved in autonomous driving or providing support for autonomous driving according to example embodiments presented herein;
  • Figure 9 is a flow diagram of example operations occurring in a DTRM node supporting AVs according to example embodiments presented herein;
  • Figure 10 is a flow diagram of example operations occurring in a DTRM node updating models and the 3D map according to example embodiments presented herein;
  • Figure 11 is a flow diagram of embodiment operations for faulty sensor detection;
  • Figure 12 is a flow diagram of embodiment operations for faulty sensor detection; and
  • Figure 13 is a block diagram of a computing system according to an embodiment.
  • Figure 1 is a diagram 100 of an autonomous vehicle (AV) 105 highlighting different types of sensors deployed on the AV 105.
  • AV 105 is on a road 110, along with vehicles 115-118.
  • some of vehicles 115-118 may be AVs, while the remainder are not.
  • AV 105 includes a plurality of sensors, including cameras (optical wavelength, infrared wavelength, stereo, etc.), radars (short range, long range, etc.), light detection and ranging (LiDAR) sensors, and so on.
  • the sensors may be deployed at different locations of AV 105.
  • a first camera and a stereo camera may be deployed at the front end of AV 105, while a second camera may be deployed at the rear end of AV 105, and radars may be deployed at the four corners of AV 105.
  • Figure 1 illustrates example coverage areas of the sensors deployed in AV 105.
  • sensors deployed in an AV are robust and are well protected. However, the sensors may become damaged in an accident.
  • a sensor deployed in the bumper of an AV may be damaged if the AV bumps into a wall during parking or strikes an animal during driving.
  • dirt and debris from the road may also impact sensor function.
  • dirt, mud, or ice may collect on a sensor and negatively impact sensor function.
  • inclement weather may reduce the effectiveness of the sensor.
  • heavy rain, snow, or fog may reduce the visual acuity of cameras.
  • Figure 2 is a diagram 200 of AV 105 highlighting a faulty radar sensor 205.
  • Faulty radar sensor 205 may arise from physical damage to the radar sensor 205. Alternatively, dirt or mud may have accumulated on the radar sensor 205, negatively impacting the performance of the radar sensor 205.
  • region 210 a portion (shown in diagram 200 as region 210) of the sensor range of AV 105 is missing or inaccurate.
  • AV 105 may be incapable of detecting pedestrians, vehicles, or other obstructions within region 210. Because of the missing or inaccurate sensor information within region 210, AV 105 may not be able to operate in a fully autonomous manner, or AV 105 may require driver intervention to operate safely. This defeats the concept of autonomous driving.
  • Figure 3 is a diagram 300 of a portion of a smart road 305.
  • the portion of smart road 305 is of an intersection of two roads.
  • Smart road 305 includes sensors 310-317, which may be magnetic sensors, optical sensors, cameras, speed detection devices, weather sensors, temperature sensors, etc., or any combination thereof. Additional sensors can be used to generate information for AVs, and such additional sensors are within the scope of this specification and claims.
  • Operating on smart road 305 there may be a mix of AVs 320-329 and non-AV vehicles 330-331.
  • a first AV may use information from a second AV that is near the first AV to supplement information from a failed or impaired sensor in the first AV.
  • the first AV may obtain the information of the second AV directly through a connection established between the first and the second AVs, or indirectly through a relay, a centralized process system or server, or a distributed processing system.
  • if AV 320 has several faulty sensors, it may be possible to utilize sensor data from AV 321 and AV 325 to supplement missing information from the failed sensors of AV 320.
  • information from multiple AVs, as well as sensors of a smart road may be aggregated to augment the autonomous driving of the AVs operating in the vicinity of the sensors and AVs.
  • sensor data from AVs 320-328 and sensors 310-317 are aggregated to improve the autonomous driving performance of AVs 320-328.
  • the computational requirements may exceed the capabilities of a single AV or centralized computing system. Therefore, there is a need for methods and apparatus for supporting AVs in distributed computing systems.
  • a distributed computing system e.g., a multi-edge computing (MEC) system, supporting AVs
  • MEC multi-edge computing
  • a MEC system for supporting AVs may be referred to as a dynamic traffic road management (DTRM) system.
  • DTRM dynamic traffic road management
  • the DTRM system provides support in terms of data or information support, as well as computation support.
  • the DTRM system may be utilized to reduce the computational resource requirements of an AV, as well as data storage and movement requirements of the AV.
  • the DTRM system may provide data or information support in terms of aggregating data or information from multiple sources and then providing the aggregated data or information to an AV.
  • the amount of data being delivered to the AV may exceed the storage capabilities of the AV.
  • the DTRM system may perform data processing and deliver only the data that is relevant to the AV, thereby reducing the storage and computational requirements of the AV.
  • the DTRM system may provide computational resource support to an AV in a situation where the AV requires more computation resources than available on the AV.
  • the processing of the data intended for an AV may in some instances exceed the computational resources available at the AV (while ensuring sufficient computational resources remain available to ensure safe autonomous vehicle operation). In such a situation, the DTRM system may perform at least a portion of the computation and provide the results of the computation to the AV, thereby freeing precious computational resources at the AV.
  • the DTRM system provides data handling and processing support for AVs.
  • the DTRM system aggregates sensor data and information from information services and provides the aggregated information to the AVs.
  • the aggregated information may be provided to the AVs in a raw (e.g., unprocessed) form or in a processed form.
  • the aggregated information may be provided to the AVs based on the location of the AVs.
  • an AV is provided aggregated information that is within a specified distance from the AV or at an expected (or predicted) location of the AV.
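A minimal sketch of this location-based filtering, including the "expected (or predicted) location" case: dead-reckon the AV forward by its velocity, then keep only the aggregated items near either the current or the predicted position. The linear prediction, horizon, and radius below are illustrative assumptions, not values from the specification:

```python
import math

def relevant_items(items, av_pos, av_vel, horizon_s=5.0, radius=200.0):
    """Filter aggregated data down to what is relevant for one AV.

    items: list of (position, payload) tuples.
    av_pos/av_vel: (x, y) position in meters and velocity in m/s.
    Returns the payloads within `radius` of the AV's current position or
    of its position predicted `horizon_s` seconds ahead.
    """
    pred = (av_pos[0] + av_vel[0] * horizon_s,
            av_pos[1] + av_vel[1] * horizon_s)
    relevant = []
    for pos, payload in items:
        d_now = math.hypot(pos[0] - av_pos[0], pos[1] - av_pos[1])
        d_pred = math.hypot(pos[0] - pred[0], pos[1] - pred[1])
        if min(d_now, d_pred) <= radius:
            relevant.append(payload)
    return relevant
```

Delivering only the filtered payloads is what reduces the AV's storage and computational burden relative to receiving the full aggregate.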
  • the DTRM system provides sensor data and information that is ordinarily unavailable at the AV to ensure safe autonomous operation by augmenting the data and information available at the AV.
  • the AV can make use of the provided sensor data to improve autonomous operation.
  • the DTRM system provides processing support for AVs.
  • the DTRM system performs computational operations on aggregated sensor data and information from information services to develop models or maps to help the AV operate autonomously in an efficient manner.
  • the DTRM system combines sensor data (from mobile sensors in AVs and fixed sensors deployed in and around the smart road) and information from information services to develop models or maps of the traffic, roads, weather, etc., using available computational resources to help the AV operate autonomously in a situation when the AV has a faulty or impaired sensor.
  • sensor data associated with a faulty or impaired sensor is replaced with sensor data of an alternate sensor located near the faulty or impaired sensor. If the alternate sensor is located within a specified distance of the faulty or impaired sensor and if the alternate sensor and the faulty or impaired sensor are of the same sensor type, the sensor data of the alternate sensor may be a suitable substitute for the faulty or impaired sensor. In an embodiment, sensor data from a plurality of alternate sensors may be used as a substitute for the sensor data from the faulty or impaired sensor. In general, larger numbers of alternate sensors may yield better performance. However, too many alternate sensors may unnecessarily increase the amount of sensor data to process and needlessly increase the computational requirements without significantly improving performance.
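The trade-off described above can be made concrete: fuse readings from at most a few of the nearest alternate sensors, since additional sensors increase the processing load while yielding diminishing accuracy gains. The simple mean and the cap of three alternates are illustrative choices, not taken from the specification:

```python
def fuse_alternates(alternate_values, max_sensors=3):
    """Substitute a fused reading for a faulty sensor's data.

    alternate_values: readings from alternate sensors of the same type,
    assumed sorted nearest-first. Only the closest `max_sensors` are used,
    capping the computational cost of the substitution.
    """
    if not alternate_values:
        raise ValueError("no alternate sensor data available")
    used = alternate_values[:max_sensors]
    return sum(used) / len(used)
```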
  • Figure 4 is a diagram of a system 400 highlighting a resource pool used to provide support for AV 405 operation.
  • AV 405 is in motion along route 406 and is processing sensor data from sensors 407 and information services 408.
  • AV 405 is in coverage of a DTRM system.
  • AV 405 may leave the coverage of a first antenna 410 and enter the coverage of a second antenna 411 or a third antenna 412. In some situations, AVs may be in the coverage of multiple antennas.
  • antennas 410, 411, and 412 make up a radio access network (RAN) 413 providing connectivity for AV 405 with a DTRM system 414.
  • RAN radio access network
  • a resource pool 415 represents the available DTRM nodes (MEC nodes) of DTRM sites (MEC sites) of DTRM system 414 that are capable of communicating with AV 405 as AV 405 moves along route 406.
  • a DTRM site corresponds to a coverage area or service area within which MEC services are provided, and may include one or more DTRM nodes providing the MEC services, such as data/information processing and storage described above.
  • a DTRM site may be configured with a DTRM system.
  • Each DTRM node may be viewed as a data center or an edge cloud deployed closely to users.
  • Each DTRM node may provide computing resources, memory resources, storage resources, etc., for executing applications in virtual machines (VMs) or in containerized environments, such as Docker containers.
  • VMs virtual machines
  • containerized environments, such as Docker containers.
  • each DTRM node has considerably more resources (e.g., multiple multi-core processors, multiple gigabytes of memory, high bandwidth connectivity, etc.) than devices of users.
  • resource pool 415 includes available DTRM nodes that are capable of communicating with AV 405 within a particular time window.
  • resource pool 415 may exclude DTRM nodes that are capable of communicating with AV 405, but only at a significant time in the future. Excluding such DTRM nodes may be advantageous because AV 405 may no longer be on route 406 so far in the future and planning for such a possibility may be a waste of resources. Furthermore, uncertainty typically increases with time, and hence planning so far into the future may yield inaccurate results.
  • resource pool 415 may include a subset of the DTRM nodes of DTRM sites 416, 417, 418, and 419.
  • DTRM site 419 may be a redundant computing resource interconnecting DTRM sites 416, 417, and 418, and may work as a backup computing resource. It may also be a higher-level computing node that helps coordinate computing among DTRM sites 416, 417, and 418.
  • a DTRM node may be a part of resource pool 415 for AV 405 if the DTRM node has sufficient computation resources to provide computational resources or data processing to AV 405 while AV 405 is within coverage of a DTRM site including the DTRM node. If the DTRM node has insufficient computational resources or cannot provide data processing to AV 405 while AV 405 is within coverage of the DTRM site associated with the DTRM node, the DTRM node is not a member of resource pool 415.
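The membership rule above amounts to a predicate: a DTRM node joins the AV's resource pool only if it has enough capacity free for the whole interval during which the AV is expected to be within the node's coverage. The field names and units below are assumptions for illustration:

```python
def in_resource_pool(node, av_demand, coverage_start, coverage_end):
    """Decide whether a DTRM node qualifies for an AV's resource pool.

    node: {"free_cpu": cores, "free_mem": GB, "busy_until": seconds} —
    busy_until is when the node's current commitments free up.
    av_demand: {"cpu": cores, "mem": GB} the AV's processing needs.
    coverage_start/coverage_end: the interval (seconds) during which the
    AV is expected to be within the node's DTRM site coverage.
    """
    has_capacity = (node["free_cpu"] >= av_demand["cpu"]
                    and node["free_mem"] >= av_demand["mem"])
    free_in_time = node["busy_until"] <= coverage_start
    return has_capacity and free_in_time and coverage_end > coverage_start
```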
  • Figure 5A is a diagram 500 of AV 505 and DTRM nodes providing computational resources or data processing to AV 505 at a first time instance.
  • DTRM nodes 511, 512, 513, and 514 are providing computational resources or data processing to AV 505 (traversing route 506), while other DTRM nodes, such as DTRM nodes 510 and 515 (shown as clear circles) are not chosen to provide computational resources or data processing (or at least not yet providing computational resources or data processing) to AV 505.
  • DTRM nodes 511, 512, 513, and 514 may be DTRM nodes that are within a time window of a current location of AV 505.
  • the DTRM nodes that are within the time window may be expected to be serving AV 505 or will be serving AV 505 with a certain probability.
  • the size of the time window may vary based on factors including environmental factors, traffic information, emergency information, etc. In general, the time window should be sufficiently large to enable efficient scheduling of multiple DTRM nodes at any one time; however, the time window should not be too large, or else the uncertainty associated with the mobility of AV 505 may result in too many scheduled jobs and tasks having to be rescheduled.
  • the number of DTRM nodes providing computational resources or data processing to AV 505 may be a pre-specified value.
  • the number of DTRM nodes is four, although other numbers may be prespecified.
  • the number of DTRM nodes may be related to the time window. As an example, if AV 505 is moving rapidly, the number of DTRM nodes may be greater, while if AV 505 is moving slowly, the number of DTRM nodes may be lesser.
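The relationship stated above between AV speed, the time window, and the number of serving DTRM nodes can be sketched with an illustrative formula; the base window, node spacing, and scaling factor are assumptions made for this sketch, not values from the disclosure.

```python
def nodes_in_window(av_speed_mps, node_spacing_m,
                    base_window_s=30.0, condition_factor=1.0):
    """Estimate how many DTRM nodes fall inside the scheduling time
    window ahead of the AV. A faster AV covers more distance within
    the window and therefore involves more nodes; condition_factor
    lets environmental or emergency factors shrink or grow the window."""
    window_s = base_window_s * condition_factor
    distance_m = av_speed_mps * window_s
    return max(1, int(distance_m // node_spacing_m))
```

With nodes roughly every 250 m, an AV at 30 m/s maps to more serving nodes than one at 10 m/s, consistent with the example in the text.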
  • Figure 5B is a diagram 550 of AV 505 and DTRM nodes providing computational resources or data processing to AV 505 at a second time instance.
  • the second time instance occurs later in time than the first time instance.
  • DTRM nodes 512, 513, 514, and 515 are within the time window and are providing computational resources or data processing to AV 505 (traversing route 506), while other DTRM nodes, such as DTRM nodes 510, 511, and 516 are not chosen to provide computational resources or data processing (or at least not yet providing computational resources or data processing) to AV 505.
  • the geolocation information of an AV is used to control the support provided to the AV by the DTRM system.
  • the geolocation information of an AV includes the current location of the AV, the estimated location of the AV at a future time, the velocity of the AV, the direction of the AV, the destination of the AV, the projected path of the AV, and so on.
  • the geolocation information of the AV may be determined in accordance with location information provided by a Global Navigation Satellite System (GNSS) (such as Global Positioning System (GPS), Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), Galileo, BeiDou Navigation Satellite System (BDS), etc.), cellular communication system based location measurement information, velocity information, and so on.
  • the geolocation information may be used to determine which parts of the DTRM system may be used to provide support to the AV. Additionally, the geolocation information may be used to determine which type of support is provided to the AV.
  • one or more DTRM nodes of the DTRM system are assigned to support the AV in accordance with the geolocation information of the AV.
  • the one or more DTRM nodes may change as the AV moves along its route, e.g., a first DTRM node may be unassigned from providing support to the AV as the AV exits the coverage of the first DTRM node, while a second DTRM node may be assigned to provide support to the AV as the AV enters the coverage of the second DTRM node.
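The rolling assignment described above (unassigning nodes the AV has exited and assigning nodes it has entered) can be sketched as follows; the 1-D coverage model and data shapes are illustrative assumptions.

```python
def update_assignments(assigned, av_position, nodes):
    """Return (serving, newly_assigned, unassigned) for one update step.
    `nodes` maps node id -> (coverage center, coverage radius) along a
    1-D route; `assigned` is the set of node ids currently serving."""
    serving = {node_id for node_id, (center, radius) in nodes.items()
               if abs(av_position - center) <= radius}
    newly_assigned = serving - assigned   # AV entered these coverages
    unassigned = assigned - serving       # AV exited these coverages
    return serving, newly_assigned, unassigned
```

For example, as the AV moves from the first node's coverage into the second's, the first is unassigned and the second assigned, as the text describes.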
  • the geolocation information of an AV is used to control the augmentation of sensor data used to support the operation of the AV.
  • the geolocation information may be used to identify the sensor data from specific sensors that may be used to augment the sensor data provided by the sensors of the AV.
  • the geolocation information of the AV is used to identify AVs that are within a specified distance (i.e., "close") of the AV, as well as fixed sensors of the smart road, that may provide sensor data usable in augmenting the sensor data of the AV.
  • the sensor data from the AV, along with the sensor data from the identified AVs and fixed sensors, may be processed to support the operation of the AV.
  • information from information services is also identified by the geolocation information of the AV and used to support the operation of the AV.
  • the information from the information services may include topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
  • a service such as autonomous driving, may be partitioned into tasks and mapped onto MEC nodes (e.g., DTRM nodes) of MEC sites (e.g., DTRM sites) based on the requirements of the service and geolocation information of the AV. The partitioning and mapping are performed so that the service is continuous while the AV moves along its route.
  • sensor data that is missing or unreliable due to a faulty or damaged sensor is replaced by sensor data from functional sensors (that are functioning normally) in accordance with the geolocation information associated with the faulty or damaged sensor.
  • the sensor data from the faulty or damaged sensor may be replaced with sensor data from a functional sensor in a situation where the faulty or damaged sensor and the functional sensor are in close proximity (e.g., the location of the faulty or damaged sensor is within a specified threshold of the location of the functional sensor).
  • the specified threshold is a predefined value or a value that can be dynamically adjusted based on operating conditions.
  • when the operating condition is poor (e.g., rain, snow, sleet, fog, darkness, etc.), the specified threshold is small, while when the operating condition is good (e.g., clear sky, daylight, no fog, etc.), the specified threshold is large.
  • the faulty or damaged sensor and the functional sensor are of the same sensor type. In another embodiment, the faulty or damaged sensor and the functional sensor have the same or similar sensing range. In other words, the sensor data from the faulty or damaged sensor and from the functional sensor should cover approximately the same area or the sensing range of the functional sensor is a superset of the sensing range of the faulty or damaged sensor. The functional sensor should "see" about the same or more than the faulty or damaged sensor.
  • the sensor data from the faulty or damaged sensor is replaced with sensor data from a plurality of functional sensors.
  • the sensor data from the plurality of functional sensors may be used in place of the sensor data from the faulty or damaged sensor.
  • the specified threshold may be relaxed when there are multiple functional sensors. In general, the amount of relaxation of the specified threshold is dependent on the number of functional sensors.
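The replacement rules above (same sensor type, a proximity threshold that shrinks in poor conditions, and relaxation of the threshold as more functional sensors become available) can be sketched as follows. The distance values and the relaxation rate are illustrative assumptions; the disclosure does not fix them.

```python
import math

def replacement_sensors(faulty, candidates, condition="good", relax_per_sensor=0.2):
    """Select functional sensors whose data may stand in for a faulty
    or damaged sensor's data: same sensor type, functioning normally,
    and within the (condition-dependent, relaxable) distance threshold."""
    base = {"good": 100.0, "poor": 30.0}[condition]   # meters, illustrative
    usable = [s for s in candidates
              if s["type"] == faulty["type"] and s["ok"]]
    # Relax the threshold when multiple functional sensors are available.
    threshold = base * (1 + relax_per_sensor * max(0, len(usable) - 1))
    def dist(a, b):
        return math.hypot(a["x"] - b["x"], a["y"] - b["y"])
    return [s["id"] for s in usable if dist(s, faulty) <= threshold]
```

In poor conditions the smaller base threshold admits fewer stand-in sensors than in good conditions, matching the behavior stated above.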
  • FIG. 6 is a diagram of a smart road deployment 600 with a DTRM system providing AV operation support.
  • Smart road deployment 600 supports the operation of a plurality of AVs, including AVs 605-609.
  • Smart road deployment 600 includes a DTRM system with a plurality of DTRM sites, including DTRM sites 610, 612, and 614, with coverage areas 611, 613, and 615, respectively.
  • Each AV includes one or more sensors (SNSRs), which are referred to as mobile sensors due to their location on the AVs. Sensor data from the mobile sensors may be utilized by the AV on which they are deployed. The sensor data may be communicated to the DTRM system, where they may be used to augment the operation of the AVs.
  • Smart road deployment 600 also includes sensors (such as sensors 620-628) that are deployed on or in the roads, in or on buildings, on lights, on signs, and so on. These sensors are referred to as fixed sensors due to their being immobile (when compared to the mobile sensors of AVs). Sensor data from the fixed sensors may be communicated to the DTRM system, where they can be used to augment the operation of the AVs.
  • sensor data from sensor 623 may be used to replace sensor data from a faulty sensor located on AV 605, while sensor data from sensors 623 and 625 may help support the operation of AV 606.
  • AV 606 is in coverage area 615 of DTRM site 614, while sensor 623 is in the coverage area 611 of DTRM site 610.
  • sensor data from sensor 623 may be shared with DTRM site 614.
  • sensor data from AV 606 may be used to replace sensor data from a faulty sensor located in AV 605.
  • sensor data from mobile sensors or fixed sensors may be used to replace sensor data from faulty or damaged sensors.
  • the sensor data may be utilized in one or more DTRM sites to help support the operation of AVs with faulty or damaged sensors.
  • sensor data from fixed sensor 627 may be used to replace or augment sensor data from a faulty or damaged sensor of AV 607.
  • the sensor data may also be used to support the operation of AVs without any faulty or damaged sensors.
  • the sensor data may be used to enhance the operation of fully operational AVs.
  • the operation of AVs may be improved, e.g., in safety, speed, and so on.
  • sensor data from fixed sensors 626 and 628 may be used to help improve the operation of AV 608.
  • the examples presented herein are intended for illustrative purposes and are not intended to limit the scope of the example embodiments.
  • services are deployed at DTRM nodes of a DTRM system to support the operation of AVs.
  • Deploying services at DTRM nodes implements a distributed system for supporting the operation of AVs.
  • the distributed implementation helps to reduce the overhead associated with communicating the sensor data from mobile and fixed sensors deployed in the DTRM system, as well as distributing information usable in assisting the AVs throughout the DTRM system.
  • the distributed implementation facilitates a level of fault tolerance and redundancy that is important in a time-critical application, such as autonomous driving.
  • Figure 7 is a diagram of a portion of a DTRM system 700 deploying an autonomous driving support system, highlighting a DTRM node 705, services implemented at DTRM node 705 for supporting AVs, and AVs 710-712.
  • DTRM node 705 is connected to other DTRM nodes of DTRM sites of DTRM system 700.
  • AVs 710-712 may have subscriptions with DTRM system 700, and may also be referred to as subscribed AVs.
  • DTRM node 705 is connected to AVs 710-712, providing support for autonomous driving by AVs 710-712.
  • DTRM node 705 may begin to support a new AV when the new AV enters the coverage area of DTRM node 705, and stop supporting the new AV when the new AV exits the coverage area of DTRM node 705.
  • as an AV exits the coverage area of a DTRM node, it will enter the coverage area of another DTRM node.
  • the AV may stop receiving autonomous driving support unless another DTRM system is available and the AV has a subscription with the other DTRM system.
  • DTRM node 705 includes a messaging service 720 that is configured to support communications for DTRM node 705.
  • Messaging service 720 processes messages sent from and received by DTRM node 705.
  • Messaging service 720 includes a message agent 722 that is configured to provide a messaging service.
  • Message agent 722 may be configured to establish, manage, and control a messaging flow and message exchanges associated with the messaging flow.
  • Messages exchanged may include information of a mobile device (e.g., devices in an AV), a user, an account of the mobile device or the user, a service associated with an application, security, priority, data to be processed or that has been processed, policy, regional map, flow table, states of DTRM nodes, scheduling information, tasks/jobs, and any other information related to execution of a service provided to an AV.
  • message agent 722 may receive a messaging service request, e.g., from another DTRM node or from an AV, verify the messaging service request, establish and tear down a messaging flow, create a virtual messaging network, generate and assign a messaging flow ID, assist in the performing of planning and scheduling of tasks and jobs, distribute scheduling information, distribute the tasks and jobs, deliver to-be-processed and processed data, and maintain a regional map and flow table.
  • Messaging service 720 also includes an alarm message agent 724 that is configured to process alarm messages. The alarm messages may be used to inform an AV of a potentially dangerous situation, for example. The alarm messages may also be used to alert DTRM node 705 of a potentially dangerous situation.
  • the message agents in the AVs support communication by the AV.
  • DTRM node 705 includes a traffic-road (TR) modeling service 730 that is configured to generate and maintain models of the roads, traffic, and weather.
  • TR modeling service 730 makes use of sensor data (from mobile sensors and fixed sensors, for example), as well as information from information services (e.g., weather services, traffic services, emergency services, road condition services, map services, and so on).
  • TR modeling service 730 includes a road model agent 732 that is configured to generate and maintain a road model using sensor data and information from information services.
  • using mapping information from map services in conjunction with sensor data, road model agent 732 generates a road model of the roads covered by DTRM system 700, highlighting road construction, road damage, road conditions (e.g., icy, slick, wet, etc.), and so on.
  • Road model agent 732 also updates the road model based on changes in the sensor data and the information from information services. As an example, a formerly icy road may no longer be icy, road construction may have completed at one location, while road construction has started at another.
  • Road model agent 732 provides the road model to the AVs connected to DTRM node 705.
  • the road model provided to a particular AV is in accordance with the geolocation information of the AV; for example, only the portion of the road model within a specified distance of the AV is provided, where the specified distance may be a distance that the AV is expected to travel within a given time duration.
  • Providing only the portion of the road model to the AV helps reduce the amount of data communicated to the AVs.
  • the road model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
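The geolocation-based slicing described above (providing only the portion of a model within the distance the AV is expected to travel) can be sketched as follows; the flat position-keyed model structure is an illustrative assumption.

```python
def model_slice(model, av_position, av_speed_mps, horizon_s=60.0):
    """Return only the model entries within the distance the AV is
    expected to travel within `horizon_s` seconds. `model` maps a
    route position (meters) to a condition string. Sending only this
    slice reduces the amount of data communicated to the AV."""
    reach = av_speed_mps * horizon_s
    return {pos: cond for pos, cond in model.items()
            if av_position <= pos <= av_position + reach}
```

The same slicing applies equally to the weather model, the traffic model, and the 3D map discussed below, since each is distributed per-AV in accordance with geolocation information.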
  • TR modeling service 730 includes a weather model agent 734 that is configured to generate and maintain a weather model using sensor data and information from information services.
  • using weather information from weather services in conjunction with sensor data, weather model agent 734 generates a weather model for the roads covered by DTRM system 700, highlighting the weather conditions of those roads.
  • the weather model indicates rain, snow, ice, fog, etc., for the roads covered by DTRM system 700.
  • Weather model agent 734 also updates the weather model based on changes in the sensor data and the information from information services.
  • weather model agent 734 refines the weather condition of the roads based on sensor data received from mobile and fixed sensors throughout DTRM system 700.
  • weather model agent 734 may set up a message channel with message agent 722, and receive updated data from surrounding mobile and fixed sensors in DTRM system 700.
  • Weather model agent 734 may fuse all received data based on its weather model, and refine the weather model to improve the accuracy of the weather model.
  • Weather model agent 734 provides the weather model to the AVs connected to DTRM node 705.
  • the weather model provided to a particular AV is in accordance with the geolocation information of the AV; for example, only the portion of the weather model within a specified distance of the AV is provided, where the specified distance may be a distance that the AV is expected to travel within a given time duration.
  • Providing only the portion of the weather model to the AV helps to reduce the amount of data communicated to the AVs.
  • the weather model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
  • TR modeling service 730 includes a traffic model agent 736 that is configured to generate and maintain a traffic model using sensor data and information from information services.
  • using traffic information and emergency information in conjunction with sensor data, traffic model agent 736 generates a traffic model for the roads covered by DTRM system 700, highlighting traffic on the roads.
  • the traffic model indicates traffic congestion, flow, speeds, etc., for the roads covered by DTRM system 700.
  • Traffic model agent 736 also updates the traffic model based on changes in the sensor data and the information from information services.
  • traffic model agent 736 alters the traffic condition of the roads based on the sensor data and the information from information services.
  • Traffic model agent 736 provides the traffic model to the AVs connected to DTRM node 705.
  • the traffic model provided to a particular AV is in accordance with the geolocation information of the AV; for example, only the portion of the traffic model within a specified distance of the AV is provided, where the specified distance may be a distance that the AV is expected to travel within a given time duration.
  • Providing only the portion of the traffic model to the AV helps to reduce the amount of data communicated to the AVs.
  • the traffic model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
  • the TR modeling service is also configured to generate a three-dimensional (3D) map based on the traffic model, weather model, and road model.
  • the 3D map may show static objects and dynamic moving objects in a 3D space, and may show or indicate weather conditions, traffic lights, traffic incidents, and so on.
  • the 3D map may be provided to AVs connected to DTRM node 705.
  • the 3D map provided to a particular AV is in accordance with the geolocation information of the AV.
  • only a portion of the 3D map that is within a specified distance of the AV is provided to the AV, where the specified distance may be a distance that the AV is expected to travel within a given time duration, for example.
  • the 3D map is dynamic, and may be updated based on updates to the traffic model, weather model, and road model.
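The composition of the 3D map from the three models can be sketched as a per-segment merge; the segment-keyed dictionaries are an illustrative data shape, not the disclosed structure.

```python
def build_3d_map(road, weather, traffic):
    """Merge per-segment road, weather, and traffic models into one
    map keyed by segment id. Regenerating this from updated models
    keeps the map dynamic, as described above."""
    segments = set(road) | set(weather) | set(traffic)
    return {seg: {"road": road.get(seg, "unknown"),
                  "weather": weather.get(seg, "unknown"),
                  "traffic": traffic.get(seg, "unknown")}
            for seg in segments}
```

Segments known to only one model still appear in the map, with the other layers marked unknown until updated sensor data or service information fills them in.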
  • DTRM node 705 includes a TR fusion service 740 that is configured to provide an interface for inputs from different sources (e.g., mobile sensors, fixed sensors, information services, etc.) with TR modeling service 730.
  • TR fusion service 740 communicates with TR modeling service 730 through messaging service 720.
  • TR fusion service 740 includes a connection management unit 742 that is configured to maintain AV connection state. Connection management unit 742 helps establish a connection with an AV as the AV enters the coverage area of DTRM node 705 and helps handover the AV as the AV exits the coverage area of DTRM node 705, for example.
  • TR fusion service 740 includes a fixed sensor management unit 744 that is configured to compare and validate sensor data from fixed sensors.
  • TR fusion service 740 also includes motion sensor management unit 746 that is configured to compare and validate sensor data from mobile sensors. Fixed sensor management unit 744 and motion sensor management unit 746 help ensure that sensor data from respective sensors are valid.
  • TR fusion service 740 includes a sensor fusion unit 748 that is configured to combine sensor data from multiple sources (e.g., multiple mobile sensors from multiple AVs, multiple fixed sensors, a combination of mobile sensors and fixed sensors, etc.). Sensor fusion unit 748 may make use of information from fixed sensor management unit 744 and motion sensor management unit 746 in the combining of the sensor data. As an example, sensor fusion unit 748 eliminates sensor data from faulty or damaged sensors and replaces the sensor data with sensor data from other sensors that are within a specified threshold of the faulty or damaged sensors.
  • TR fusion service 740 includes a vehicle tracker 750 that is configured to maintain and update AV location in the dynamic three-dimensional (3D) map.
  • vehicle tracker 750 obtains the location of the AVs (e.g., from geolocation information of the AVs) and updates the dynamic 3D map.
  • TR fusion service 740 also includes a dynamic 3D map unit 752 that is configured to maintain and update the dynamic 3D map in accordance with sensor data and information from information services.
  • dynamic 3D map unit 752 updates the road conditions, weather conditions, traffic conditions, etc., based on the sensor data and information from information services received by TR fusion service 740.
  • the updated dynamic 3D map may be provided to subscribers and other DTRM nodes.
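The fusion step performed by sensor fusion unit 748 (dropping readings the management units flagged invalid, then combining the remainder) can be sketched as follows. The median is one simple, outlier-resistant combining choice assumed for this sketch; the disclosure does not mandate a particular fusion method.

```python
def fuse_readings(readings):
    """Combine overlapping readings from mobile and fixed sensors:
    discard readings flagged invalid by the fixed/motion sensor
    management units, then return the median of what remains."""
    valid = sorted(r["value"] for r in readings if r["valid"])
    if not valid:
        return None  # nothing usable to fuse
    n = len(valid)
    mid = n // 2
    return valid[mid] if n % 2 else (valid[mid - 1] + valid[mid]) / 2
```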
  • FIG. 8 is a diagram 800 of communication between and operations performed by participants involved in autonomous driving or providing support for autonomous driving.
  • the participants involved in autonomous driving or providing support for autonomous driving include AV 805, messaging agent 807, DTRM node 809, and authentication & policy unit 811.
  • Authentication & policy unit 811 is an entity that may be a part of DTRM node 809, a DTRM system including DTRM node 809, or an infrastructure system that is used to authenticate and authorize participants (e.g., AV 805), to ensure that the participants have a valid subscription, for example.
  • authentication & policy unit 811 may be an authentication, authorization, and accounting (AAA) function of a Third Generation Partnership Project (3GPP) compliant network.
  • AV 805 sends a connection request to messaging agent 807 of DTRM node 809 (event 815).
  • AV 805 sends the connection request to initiate the connection process with DTRM node 809, for example.
  • AV 805 may send the connection request when it comes within coverage range of DTRM node 809.
  • the connection request may include an identifier of AV 805.
  • Messaging agent 807 forwards the connection request to DTRM node 809 (event 817).
  • the forwarded connection request may be identical to the connection request received from AV 805, or it may be altered.
  • DTRM node 809 sends an authentication request (event 819).
  • the authentication request may be sent to authentication & policy unit 811 and is a request to authenticate AV 805.
  • the authentication request checks to ensure that AV 805 is an authorized user of services of DTRM node 809.
  • Authentication & policy unit 811 performs a policy and quality of service (QoS) check (block 821).
  • the policy and QoS check may be a check of the authenticity of AV 805, for example.
  • the policy and QoS check may also be a check of the QoS level requested by AV 805. As an example, the policy and QoS check determines if AV 805 has a subscription sufficient to meet the QoS level.
  • Authentication & policy unit 811 sends an authentication confirmation (event 823).
  • the authentication confirmation is sent to DTRM node 809 and confirms whether or not AV 805 has been authenticated and meets the QoS level. For discussion purposes, for the remainder of the discussion of Figure 8, it is assumed that AV 805 has successfully authenticated and met the QoS level.
  • DTRM node 809 sends a tracking identifier (ID) for AV 805 (event 825).
  • the tracking identifier is transmitted to messaging agent 807 and may be used to identify AV 805 while AV 805 is connected to DTRM node 809.
  • messages sent to AV 805 may be identified using the tracking identifier as the destination address.
  • messages sent by AV 805 may be identified using the tracking identifier as the source address.
  • Messaging agent 807 assigns a messaging flow identifier for AV 805 and sends the messaging flow identifier (event 827).
  • the messaging flow identifier is used to identify message flows from or to AV 805, for example.
  • the messaging flow identifier is sent to AV 805.
  • DTRM node 809 confirms the messaging connection (event 829).
  • AV 805 sends sensor data (event 831).
  • the sensor data, from sensors of AV 805, is sent to messaging agent 807.
  • Messaging agent 807 sends the sensor data to DTRM node 809.
  • information from information services may be sent to DTRM node 809.
  • DTRM node 809 generates the traffic, weather, and road models (block 833).
  • the traffic, weather, and road models may be generated in accordance with the sensor data and the information from information services, for example.
  • DTRM node 809 generates a 3D map in accordance with the traffic, weather, and road models.
  • DTRM node 809 provides the models to AVs (event 835).
  • the traffic, weather, and road models may be sent to messaging agent 807, which forwards the models to AV 805.
  • the traffic, weather, and road models are sent in accordance with the geolocation information of AV 805.
  • the 3D map is provided to the AVs in accordance with the geolocation information of the AVs.
  • AV 805 sends updated sensor data (event 837).
  • sensors provide a stream of sensor data at specific intervals, e.g., a specified number of times per second.
  • AV 805 sends the updated sensor data, as generated by its sensors, to messaging agent 807.
  • Messaging agent 807 forwards the updated sensor data to DTRM node 809.
  • DTRM node 809 updates the 3D map and models (block 839).
  • the 3D map and models are updated in accordance with the updated sensor data and the information from information services, for example.
  • DTRM node 809 provides the updated 3D map and models to the AVs.
  • DTRM node 809 provides the updated 3D map and models to AV 805 in accordance with the geolocation information of AV 805.
  • As AV 805 moves, it may exit the coverage area of DTRM node 809. In such a situation, AV 805 participates in a handover with another DTRM node and the session (the connection) is passed to the other DTRM node (event 843). Because AV 805 is being served by another DTRM node, messaging agent 807 stops the DTRM service for AV 805 (event 845) and DTRM node 809 removes the tracking identifier associated with AV 805 (block 847). Should AV 805 re-enter the coverage area of DTRM node 809, a new tracking identifier may be assigned to AV 805.
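The tracking-identifier lifecycle of Figure 8 (assigned on successful authentication, removed on handover, and freshly assigned on re-entry) can be sketched as a toy state holder; the class, its method names, and the ID scheme are illustrative assumptions.

```python
class DtrmConnectionTracker:
    """Toy model of the Figure 8 lifecycle at one DTRM node."""
    def __init__(self):
        self._next_id = 0
        self.tracked = {}  # AV identifier -> tracking identifier

    def connect(self, av, authenticated):
        """Events 815-825: assign a tracking ID only if the
        authentication & policy check succeeded."""
        if not authenticated:
            return None
        self._next_id += 1
        self.tracked[av] = self._next_id
        return self._next_id

    def handover(self, av):
        """Events 843-847: the session passes to another DTRM node,
        so this node removes the AV's tracking identifier."""
        return self.tracked.pop(av, None)
```

Note that an AV re-entering coverage receives a new tracking identifier rather than its old one, matching the last sentence above.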
  • Figure 9 is a flow diagram of example operations 900 occurring in a DTRM node supporting AVs.
  • Operations 900 may be indicative of operations occurring in a DTRM node as the DTRM node supports AVs by generating traffic, weather, and road models, as well as a 3D map, and providing the models and the 3D map to the AVs.
  • the DTRM node augments sensor data from faulty or damaged sensors with sensor data from other sensors.
  • Operations 900 begin with the DTRM node receiving sensor data (block 905).
  • the sensor data may be received from AVs connected to the DTRM node.
  • the DTRM node may receive sensor data from fixed sensors in the coverage area of the DTRM node.
  • the DTRM node may receive sensor data from AVs outside of the coverage area of the DTRM node, as well as fixed sensors outside of the coverage area of the DTRM node.
  • the DTRM node receives information from information services (block 907).
  • the DTRM node receives information from services, such as topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
  • the DTRM node optionally augments the sensor data (block 909).
  • the DTRM node augments the sensor data from the faulty or damaged sensors (or the missing sensor data from the faulty or damaged sensors) with sensor data from other sensors that are closely located to the faulty or damaged sensors.
  • the DTRM node augments the sensor data from AVs, for example, with sensor data from sensors that are closely located to the AVs.
  • additional sensor data helps improve the quality of support provided to the AVs. For example, additional sensor data improves the quality of the sensor information, potentially resulting in higher quality models and maps.
  • the DTRM node generates models (block 911).
  • the DTRM node generates the traffic, weather, and road models from the sensor data and the information from the information services.
  • the DTRM node shares the models (block 913).
  • the DTRM node shares information associated with the models to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV.
  • instead of providing all information of the models to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared.
  • the DTRM node generates a 3D map (block 915).
  • the 3D map is generated from the sensor data and the information from the information services.
  • the DTRM node shares the 3D map (block 917).
  • the DTRM node shares information associated with the 3D map to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV.
  • instead of providing all information of the 3D map to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared.
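One pass of the Figure 9 flow (receive sensor data, optionally augment faulty readings, generate models, and share a per-AV view) can be sketched end to end. The data shapes and the single-value stand-in replacement are illustrative assumptions made to keep the sketch short.

```python
def support_cycle(sensor_data, service_info, avs):
    """One pass of operations 900, blocks 905-913, in miniature."""
    # Block 909: replace readings marked faulty with a nearby good reading.
    good = [r for r in sensor_data if not r["faulty"]]
    augmented = good + [dict(r, value=good[0]["value"], faulty=False)
                        for r in sensor_data if r["faulty"] and good]
    # Block 911: trivial "model" summarizing readings and service info.
    models = {"traffic": service_info.get("traffic", "unknown"),
              "readings": len(augmented)}
    # Block 913: share per-AV, tagged with each AV's geolocation.
    return {av["id"]: {"models": models, "near": av["position"]}
            for av in avs}
```

Operations 1000 of Figure 10 follow the same shape, re-running this cycle with updated sensor data and updated service information.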
  • Figure 10 is a flow diagram of example operations 1000 occurring in a DTRM node updating models and the 3D map.
  • Operations 1000 may be indicative of operations occurring in a DTRM node as the DTRM node updates the traffic, weather, and road models, as well as the 3D map, and provides the updated models and 3D map to the AVs.
  • the DTRM node augments sensor data from faulty or damaged sensors with sensor data from other sensors.
  • Operations 1000 begin with the DTRM node receiving updated sensor data (block 1005).
  • the sensor data may be received from AVs connected to the DTRM node.
  • the DTRM node may receive updated sensor data from fixed sensors in the coverage area of the DTRM node.
  • the DTRM node may receive updated sensor data from AVs outside of the coverage area of the DTRM node, as well as fixed sensors outside of the coverage area of the DTRM node.
  • the DTRM node receives updated information from information services (block 1007).
  • the DTRM node receives updated information from services, such as topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
  • the DTRM node optionally augments the updated sensor data (block 1009).
  • the DTRM node augments the updated sensor data from the faulty or damaged sensors (or the missing sensor data from the faulty or damaged sensors) with updated sensor data from other sensors that are closely located to the faulty or damaged sensors.
  • the DTRM node augments the updated sensor data from AVs, for example, with updated sensor data from sensors that are closely located to the AVs.
  • additional sensor data helps improve the quality of support provided to the AVs. For example, additional sensor data improves the quality of the sensor information, potentially resulting in higher quality models and maps.
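The augmentation step in block 1009 can be sketched as filling a faulty sensor's missing reading with the reading of the nearest healthy sensor within some proximity bound. The data layout and distance threshold below are illustrative assumptions:

```python
def augment_sensor_data(readings, positions, max_distance=50.0):
    """Fill missing readings (None marks a faulty/damaged sensor) using the
    nearest healthy sensor within max_distance.

    readings: dict sensor_id -> value or None; positions: dict sensor_id ->
    (x, y) in meters. Names and the threshold are illustrative assumptions.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    augmented = dict(readings)
    for sid, value in readings.items():
        if value is not None:
            continue
        # candidate donors: healthy sensors close enough to the faulty one
        donors = [(dist(positions[sid], positions[o]), o)
                  for o, v in readings.items() if v is not None]
        donors = [(d, o) for d, o in donors if d <= max_distance]
        if donors:
            _, nearest = min(donors)
            augmented[sid] = readings[nearest]
    return augmented
```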
  • the DTRM node updates the models (block 1011).
  • the DTRM node updates the traffic, weather, and road models from the updated sensor data and the updated information from the information services.
  • the DTRM node shares the updated models (block 1013).
  • the DTRM node shares information associated with the updated models to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV.
  • the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps reduce the amount of information being shared.
  • the DTRM node updates the 3D map (block 1015).
  • the 3D map is updated from the updated sensor data and the updated information from the information services.
  • the DTRM node shares the updated 3D map (block 1017).
  • the DTRM node shares information associated with the updated 3D map to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV.
  • the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared.
  • the information about the models and/or the 3D maps shared with the AVs may be customizable for each AV based on its request, location, hardware configuration, or connection bandwidth.
  • the DTRM node may constantly or periodically receive sensor data and information from the information services, and update the models and the 3D map.
  • the timing, continuation or frequency of compiling and supplying sensor data from sensors, and/or sharing models and the 3D maps may be configurable.
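A minimal sketch of such a configurable update cycle follows; all of the callables, the interval, and the cycle count are illustrative placeholders, since the disclosure leaves the timing and frequency configurable:

```python
import time

def run_update_cycle(receive_sensor_data, receive_service_info,
                     update_models, update_map, share,
                     interval_s=1.0, cycles=None):
    """Periodically pull fresh inputs, rebuild the models and 3D map, and
    share them with AVs. cycles=None runs indefinitely (constant updating).
    """
    n = 0
    while cycles is None or n < cycles:
        sensor_data = receive_sensor_data()     # e.g., from AVs and fixed sensors
        service_info = receive_service_info()   # e.g., traffic/weather services
        models = update_models(sensor_data, service_info)
        map_3d = update_map(sensor_data, service_info)
        share(models, map_3d)
        n += 1
        if cycles is None or n < cycles:
            time.sleep(interval_s)
```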
  • the models and 3D map generated by a DTRM node may be used by the DTRM node, AVs (with which the models and 3D map are shared) or other DTRM nodes (e.g., a neighboring DTRM node exchanging models and 3D maps with the DTRM node) in various applications.
  • a DTRM node may track traffic situations of an area and provide timely traffic reports (e.g., traffic jam), accident reports (e.g., car accident), and incident reports (e.g., road construction, traffic sign issues, road conditions, etc.).
  • a DTRM node may, based on the models and 3D map, detect an abnormality of an AV and/or a sensor (a mobile sensor or a fixed sensor). For example, other cars are moving but one car is still, or a traffic light is red but one sensor shows green, etc.
  • An AV may use the models and 3D map to “see” images or objects that are blocked by a front car.
  • An AV may also obtain enhanced information for driving from the models and 3D maps, even though its embedded or pre-installed sensors may be very powerful and up to date.
  • Each AV may make its independent planning and decision based on the models and 3D map provided by DTRM node.
  • the 3D map represents multi-dimensional information, including information about road, traffic, and weather, etc., in a boundary of 3D space and time.
  • both AVs and DTRM (MEC) nodes may support fault detection of sensors. If a hardware failure occurs on an AV, the AV system may detect the failure and send a request to DTRM node(s) for help.
  • a DTRM node may detect faults or failures happening to the AV, based on its knowledge (models), e.g., videos or pictures of a sensor are black out or have decode errors.
  • the DTRM node may advise the AV about potential faults or failures.
  • the DTRM node may provide information or data which is needed by the faulty or failed sensor as inputs. It may provide extra information or data which is compiled from multiple sources, e.g., other AVs, sensors on smart infrastructure, and so on.
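One plausible heuristic for detecting that a sensor's video is blacked out, as mentioned above, is to check what fraction of a frame's pixels are near zero luminance. The thresholds below are illustrative assumptions, not values from the disclosure:

```python
def frame_is_blacked_out(frame, luminance_threshold=8, fraction=0.98):
    """Heuristic check that a video frame from a sensor is blacked out.

    frame: 2D list of 8-bit luminance values (0-255). The frame is treated
    as blacked out when at least `fraction` of its pixels fall below
    `luminance_threshold`. Both thresholds are illustrative assumptions.
    """
    pixels = [p for row in frame for p in row]
    if not pixels:
        return True  # no signal at all counts as blacked out
    dark = sum(1 for p in pixels if p < luminance_threshold)
    return dark / len(pixels) >= fraction
```

A DTRM node applying a check like this to incoming sensor streams could then advise the AV of a potential fault, as described above.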
  • Figure 11 is a flow diagram of embodiment operations 1100 for fault sensor detection.
  • Operations 1100 may be indicative of operations occurring in an AV.
  • Operations 1100 begin with the AV detecting that a sensor has a hardware failure (block 1105). That is, the sensor does not work.
  • the sensor may be referred to as a failed sensor in the following description.
  • the AV may detect the failure when, e.g., no sensor data/signal is received from the sensor, a video or a picture from the failed sensor is blacked out or has a decode error, and so on.
  • the AV may send a request to a DTRM node for help (block 1107).
  • the DTRM node may be one to which the AV is connected.
  • the request may include information of the failed sensor.
  • the request may ask for recovery of sensor data of the failed sensor by the DTRM node. That is, the AV may request the DTRM node to provide sensor data that can be used in place of sensor data of the failed sensor.
  • the DTRM node may acknowledge the request (block 1109).
  • a message exchange channel may be set up between the DTRM node and the AV, and the message exchange channel may be marked with a real-time requirement tag (block 1111).
  • the message exchange channel is used for exchanging messages between the DTRM node and the AV in response to the request of the AV.
  • the DTRM node may determine to obtain sensor data from one or more other sensors (referred to as replacement sensors) to be used in place of sensor data of the failed sensor.
  • the replacement sensors may include a fixed sensor and/or a sensor from another AV.
  • a replacement sensor may be the same sensor type as the faulty sensor.
  • the failed sensor and a replacement sensor may have the same or similar sensing range.
  • the failed sensor and a replacement sensor may be in close proximity (e.g., the location of the failed sensor is within a specified threshold of the location of the replacement sensor).
  • the DTRM node may send sensor data of the replacement sensors to the AV using the message exchange channel.
  • the AV may use the received sensor data in place of the sensor data of the failed sensor.
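The replacement-sensor criteria above (same sensor type, same or similar sensing range, close proximity) can be sketched as a simple selection routine. The field names and thresholds are illustrative assumptions:

```python
def select_replacement_sensor(failed, candidates, max_distance=30.0):
    """Pick the closest candidate satisfying the replacement criteria.

    Each sensor is a dict with hypothetical keys 'type', 'range_m', and
    'location' ((x, y) in meters); names and thresholds are assumptions.
    Returns the chosen candidate, or None if no candidate qualifies.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    best = None
    for cand in candidates:
        if cand["type"] != failed["type"]:
            continue  # must be the same sensor type
        if abs(cand["range_m"] - failed["range_m"]) > 0.2 * failed["range_m"]:
            continue  # sensing ranges too dissimilar
        d = dist(cand["location"], failed["location"])
        if d > max_distance:
            continue  # not in close proximity
        if best is None or d < best[0]:
            best = (d, cand)
    return best[1] if best else None
```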
  • Figure 12 is a flow diagram of embodiment operations 1200 for fault sensor detection.
  • Operations 1200 may be indicative of operations occurring in a DTRM node.
  • a sensor of an AV does not have a hardware failure, but the sensor signals are abnormal, e.g., the sensor is blocked/covered by mud, snow, another object, or another vehicle.
  • the AV may send sensor data of its sensors to the DTRM node.
  • the DTRM node receives the sensor data and detects that a signal or signals of the sensor is blacked out or unavailable (block 1205).
  • the DTRM node may send a notification to the AV, notifying the AV of the detection and may offer sensor input from one or more other sensors (e.g., from a fixed sensor or a sensor of another AV) (block 1207).
  • the AV may check and confirm the issue of the sensor and open an air socket ID of a socket for wireless communication (block 1209).
  • if the AV confirms that the sensor has an issue, it may determine to communicate with the DTRM node regarding the issue, and open a dedicated real-time communication channel for this sensor, so that the AV receives DTRM data to replace the data of the malfunctioning sensor or sensors.
  • a message exchange channel may then be set up between the DTRM node and the AV and associated with the air socket ID, and the message exchange channel may be marked with a real-time requirement tag (block 1211).
  • the message exchange channel is used for exchanging messages between the DTRM node and the AV in response to the notification from the DTRM node.
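A minimal model of such a message exchange channel, bound to an air socket ID and tagged with a real-time requirement, might look like the following; the class and field names are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class MessageExchangeChannel:
    """Channel set up between a DTRM node and an AV for replacing data of a
    malfunctioning sensor. Names are illustrative, not from the disclosure.
    """
    air_socket_id: str
    real_time: bool = True          # real-time requirement tag
    messages: list = field(default_factory=list)

    def send(self, sender, payload):
        # replacement sensor data flows over this channel
        self.messages.append({"from": sender, "payload": payload})

# Hypothetical usage: the DTRM node pushes replacement radar data to the AV.
channel = MessageExchangeChannel(air_socket_id="socket-42")
channel.send("dtrm-node", {"sensor": "front-radar", "data": [1.2, 3.4]})
```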
  • Figure 13 is a block diagram of a computing system 1300 according to an embodiment.
  • the computing system 1300 may be used for implementing the devices and methods disclosed herein.
  • the computing system can be a DTRM node or a DTRM site.
  • Specific devices may utilize all of the components shown or only a subset of the components, and levels of integration may vary from device to device.
  • a device may contain multiple instances of a component, such as multiple processing units, processors, memories, transmitters, receivers, etc.
  • the computing system 1300 includes a processing unit 1302.
  • the processing unit 1302 includes a central processing unit (CPU) 1314, memory 1308, and may further include a mass storage device 1304, a video adapter 1310, and an I/O interface 1312, with some or all of the components being connected to a bus 1320.
  • the memory 1308 stores instructions 1330, which are executable by one or more processors, such as the CPU 1314.
  • the bus 1320 may be one or more of any type of several bus architectures including a memory bus or memory controller, a peripheral bus, or a video bus.
  • the CPU 1314 may comprise any type of electronic data processor.
  • the memory 1308 may comprise any type of non-transitory system memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), or a combination thereof.
  • the memory 1308 may include ROM for use at boot-up, and DRAM for program and data storage for use while executing programs.
  • the mass storage 1304 may comprise any type of non-transitory storage device configured to store data, programs, and other information and to make the data, programs, and other information accessible via the bus 1320.
  • the mass storage 1304 may comprise, for example, one or more of a solid-state drive, a hard disk drive, a magnetic disk drive, or an optical disk drive.
  • the video adapter 1310 and the I/O interface 1312 provide interfaces to couple external input and output devices to the processing unit 1302.
  • input and output devices include a display 1318 coupled to the video adapter 1310 and a mouse, keyboard, or printer 1316 coupled to the I/O interface 1312.
  • Other devices may be coupled to the processing unit 1302, and additional or fewer interface cards may be utilized.
  • a serial interface such as Universal Serial Bus (USB) (not shown) may be used to provide an interface for an external device.
  • the processing unit 1302 also includes one or more network interfaces 1306, which may comprise wired links, such as an Ethernet cable, or wireless links to access nodes or different networks.
  • the network interfaces 1306 allow the processing unit 1302 to communicate with remote units via the networks.
  • the network interfaces 1306 may provide wireless communication via one or more transmitters/transmit antennas and one or more receivers/receive antennas.
  • the processing unit 1302 is coupled to a local-area network (LAN) 1322 or a wide-area network (WAN) for data processing and communications with remote devices, such as other processing units, the Internet, or remote storage facilities.
  • It should be appreciated that one or more steps of the embodiment methods provided herein may be performed by corresponding circuits, units, or modules.
  • a signal may be transmitted by a transmitting circuit, unit, or module.
  • a signal may be received by a receiving circuit, unit, or module.
  • a signal may be processed by a processing circuit, unit, or module.
  • the respective circuits, units, or modules may be hardware, software, or a combination thereof.
  • one or more of the circuits, units, or modules may be an integrated circuit, such as field programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs).
  • the computing system 1300 comprises a MEC node and the processing unit 1302 executes the instructions 1330 to receive mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area, and to receive traffic data and weather data of the geographical area.
  • the processing unit 1302 executes the instructions 1330 to generate real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data.
  • the processing unit 1302 executes the instructions 1330 to generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic and weather models.
  • the processing unit 1302 executes the instructions 1330 to share, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
  • the MEC node includes a reception module receiving mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area, and receiving traffic data and weather data of the geographical area, a generation module generating real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data, a map module generating a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic and weather models, and a share module sharing, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
  • the MEC node may include other or additional modules for performing any one of or combination of steps described in the embodiments. Further, any of the additional or alternative embodiments or aspects of the method, as shown in any of the figures or recited in any of the claims, are also contemplated to include similar modules.

Abstract

A method implemented by a multi-access edge computing (MEC) node includes receiving mobile sensor data and fixed sensor data of sensors in a geographical area; receiving traffic data and weather data; generating real-time road, traffic, and weather models of the geographic area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generating a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographic area in accordance with the real-time road, traffic, and weather models; and sharing, with at least one autonomous vehicle (AV), the 3D map in accordance with geolocation information associated with the at least one AV.

Description

Methods and Apparatus for Supporting Autonomous Vehicles in Multi-Edge Computing Systems
TECHNICAL FIELD
The present disclosure relates generally to methods and apparatus for digital computing and communications, and, in particular embodiments, to methods and apparatus for supporting autonomous vehicles (AVs) in multi-edge computing (MEC) systems.
BACKGROUND
Multi-access edge computing (MEC) generally offers a new computing model that extends the capability of what is ordinarily available in single devices, thereby enabling resource-limited devices to not only perform tasks ordinarily outside of their capabilities, but also realize new tasks while keeping up with ever increasing demands.
As an example, autonomous vehicles (AVs) support a wide range of applications, such as applications for tracking, autonomous driving, safety, navigation, machine learning, machine vision, voice recognition, gesture recognition, and so forth. For these applications to operate at their fullest potential, accurate and timely sensor data from sensors located on the AVs is needed. However, the sensors may fail (for example, a sensor may be damaged when the AV bumps into an object, or a sensor may be rendered inoperative by weather conditions or by road dirt accumulating on the sensor, and so on). Inaccurate or untimely sensor data may prevent some of the applications executing in the AV from operating properly, or even more dangerously, the applications may execute in an erroneous fashion. Therefore, there is a need for methods and apparatus for supporting AVs in MEC systems.
SUMMARY
According to one aspect of the present disclosure, a method implemented by a first multi-access edge computing (MEC) node is provided. The method includes: receiving mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receiving traffic data and weather data of the geographical area; generating real-time road, traffic and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generating a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographic area in accordance with the real-time road, traffic, and weather models; and sharing, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the mobile sensor data being received from the at least one AV.
Optionally, in any of the preceding aspects, the mobile sensor data being received from a second AV within the geographic area.
Optionally, in any of the preceding aspects, the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
Optionally, in any of the preceding aspects, the traffic data and the weather data being received from at least one of sensors or information services.
Optionally, in any of the preceding aspects, the method further comprises sharing, with the at least one AV, the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the sharing the real-time road, traffic, and weather models comprising sharing a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the method further comprises sharing, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
Optionally, in any of the preceding aspects, the method further comprises sharing, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
Optionally, in any of the preceding aspects, the sharing the 3D map comprising sharing a portion of the 3D map in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the method further comprises: receiving an update of at least one of: the fixed sensor data, the mobile sensor data, the traffic data, or the weather data; updating the 3D map in accordance with the update; and sharing, with the at least one AV, the updated 3D map in accordance with the geolocation information associated with the at least one AV.

Optionally, in any of the preceding aspects, the method further comprises determining a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors, and based thereon, replacing sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
Optionally, in any of the preceding aspects, the method further comprises receiving a request from a first AV connected with the MEC node, requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.
Optionally, in any of the preceding aspects, the method further comprises determining a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
According to another aspect of the present disclosure, a multi-access edge computing (MEC) node is provided that includes: a non-transitory memory storing instructions; and at least one processor in communication with the memory, the at least one processor configured, upon execution of the instructions, to: receive mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receive traffic data and weather data of the geographical area; generate real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic and weather models; and share, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the mobile sensor data being received from the at least one AV.
Optionally, in any of the preceding aspects, the mobile sensor data being received from a second AV within the geographic area.
Optionally, in any of the preceding aspects, the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
Optionally, in any of the preceding aspects, the traffic data and the weather data being received from at least one of sensors or information services.

Optionally, in any of the preceding aspects, the instructions causing the MEC node to share, with the at least one AV, the real-time road, traffic and weather models in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the instructions causing the MEC node to share a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the instructions further causing the MEC node to share, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
Optionally, in any of the preceding aspects, the instructions further causing the MEC node to share, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
Optionally, in any of the preceding aspects, the 3D map comprising information of the at least one AV.
Optionally, in any of the preceding aspects, the instructions causing the MEC node to: receive an update of at least one of: the fixed sensor data, the mobile sensor data, the traffic data, or the weather data; update the 3D map in accordance with the update; and share, with the at least one AV, the updated 3D map in accordance with the geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the instructions causing the MEC node to: determine a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors; and based thereon, replace sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
Optionally, in any of the preceding aspects, the instructions further causing the MEC node to receive a request from a first AV connected with the MEC node, the request requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.

Optionally, in any of the preceding aspects, the instructions further causing the MEC node to determine a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
According to another aspect of the present disclosure, a first multi-access edge computing (MEC) node is provided that includes: a messaging agent configured to communicate messages with a second MEC node, configured to communicate messages with at least one autonomous vehicle (AV) operating within coverage of the first MEC node, configured to receive sensor data from sensors in a geographical area, and configured to receive data from information services; a modeling service operatively coupled to the messaging agent, the modeling service configured to generate real-time road, traffic, and weather models of the geographical area in accordance with the sensor data and the data received from the information services; and a fusion service operatively coupled to the modeling service, the fusion service configured to generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic, and weather models.
Optionally, in any of the preceding aspects, the fusion service configured to validate sensor data received from mobile sensors and fixed sensors in the geographical area, track AVs operating within the coverage of the first MEC node, and maintain a connection state with the at least one AV.
Optionally, in any of the preceding aspects, the messaging agent configured to receive updated sensor data or updated data from the information services.
Optionally, in any of the preceding aspects, the modeling service configured to update the real-time road, traffic, or weather models in accordance with the updated sensor data or the updated data from the information services.
Optionally, in any of the preceding aspects, the fusion service configured to update the 3D map in accordance with the updated real-time road, traffic, or weather models.
Optionally, in any of the preceding aspects, the messaging agent configured to share the updated 3D map in accordance with geolocation information associated with the at least one AV.
Optionally, in any of the preceding aspects, the messaging agent configured to share, with the second MEC node, the updated 3D map in accordance with geolocation information associated with the second MEC node.

Optionally, in any of the preceding aspects, the messaging agent configured to share, with the second MEC node, the updated real-time road, traffic, or weather models in accordance with geolocation information associated with the second MEC node.
An advantage of the disclosed embodiments is that a multi-edge computing (MEC) system uses data from a distributed system of sensors and information services to provide accurate and timely information to autonomous vehicles (AVs), enabling the proper and safe execution of applications by the AVs.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of the present disclosure, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
Figure 1 is a diagram of an autonomous vehicle (AV) highlighting different types of sensors deployed on an AV;
Figure 2 is a diagram of the AV highlighting a faulty radar sensor;
Figure 3 is a diagram of a portion of a smart road;
Figure 4 is a diagram of a system highlighting a resource pool used to provide support for the AV operation according to example embodiments presented herein;
Figure 5A is a diagram of the AV and dynamic traffic road management (DTRM) nodes providing computational resources or data processing to the AV at a first time instance according to example embodiments presented herein;
Figure 5B is a diagram of the AV and the DTRM nodes providing computational resources or data processing to the AV at a second time instance according to example embodiments presented herein;
Figure 6 is a diagram of a smart road deployment with a DTRM system providing AV operation support according to example embodiments presented herein;
Figure 7 is a diagram of a portion of a DTRM system deploying an autonomous driving support system, highlighting a DTRM node, services implemented at the DTRM node for supporting AVs, and AVs according to example embodiments presented herein;

Figure 8 is a diagram of communication between and operations performed by participants involved in autonomous driving or providing support for autonomous driving according to example embodiments presented herein;
Figure 9 is a flow diagram of example operations occurring in a DTRM node supporting AVs according to example embodiments presented herein;
Figure 10 is a flow diagram of example operations occurring in a DTRM node updating models and the 3D map according to example embodiments presented herein;
Figure 11 is a flow diagram of embodiment operations for fault sensor detection;
Figure 12 is a flow diagram of embodiment operations for fault sensor detection; and
Figure 13 is a block diagram of a computing system according to an embodiment.
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
The structure and use of disclosed embodiments are discussed in detail below. It should be appreciated, however, that the present disclosure provides many applicable concepts that can be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific structure and use of embodiments, and do not limit the scope of the disclosure.
Figure 1 is a diagram 100 of an autonomous vehicle (AV) 105 highlighting different types of sensors deployed on the AV 105. As shown in Figure 1, AV 105 is on a road 110, along with vehicles 115-118. One or more of vehicles 115-118 are AVs, while the remainder are not.
AV 105 includes a plurality of sensors, including cameras (optical wavelength, infrared wavelength, stereo, etc.), radars (short range, long range, etc.), light detection and ranging (LiDAR) sensors, and so on. The sensors may be deployed at different locations of AV 105. As an example, a first camera and a stereo camera may be deployed at the front end of AV 105, while a second camera may be deployed at the rear end of AV 105, and radars may be deployed at the four corners of AV 105. Figure 1 illustrates example coverage areas of the sensors deployed in AV 105.
In general, sensors deployed in an AV are robust and are well protected. However, the sensors may become damaged in an accident. As an example, a sensor deployed in the bumper of an AV may be damaged if the AV bumps into a wall during parking or strikes an animal during driving. Additionally, dirt and debris from the road may also impact sensor function. As an example, dirt, mud, or ice may collect on a sensor and negatively impact sensor function. Furthermore, inclement weather may reduce the effectiveness of the sensor. As an example, heavy rain, snow, or fog may reduce the visual acuity of cameras.
Figure 2 is a diagram 200 of AV 105 highlighting a faulty radar sensor 205. Faulty radar sensor 205 may arise from physical damage to the radar sensor 205. Alternatively, dirt or mud may have accumulated on the radar sensor 205, negatively impacting the performance of the radar sensor 205. As a result of faulty radar sensor 205, a portion (shown in diagram 200 as region 210) of the sensor range of AV 105 is missing or inaccurate. Hence, AV 105 may be incapable of detecting pedestrians, vehicles, or other obstructions within region 210. Because of the missing or inaccurate sensor information within region 210, AV 105 may not be able to operate in a fully autonomous manner, or AV 105 may require driver intervention to operate safely. This defeats the concept of autonomous driving.
However, there may be more than one AV operating within a given location at any given time. Furthermore, as smart road deployment becomes more widespread, sensors and detection equipment associated with smart roads become more widely available.
Figure 3 is a diagram 300 of a portion of a smart road 305. The portion of smart road 305 is of an intersection of two roads. Smart road 305 includes sensors 310-317, which may be magnetic sensors, optical sensors, cameras, speed detection devices, weather sensors, temperature sensors, etc., or any combination thereof. Additional sensors can be used to generate information for AVs, and such additional sensors are within the scope of this specification and claims. Operating on smart road 305, there may be a mix of AVs 320-329 and non-AV vehicles 330-331.
Therefore, within a relatively small geographic area, there may be a wide variety of sensors that can provide information that can be supportive of autonomous driving. As an example, a first AV may use information from a second AV that is near the first AV to supplement information from a failed or impaired sensor in the first AV. The first AV may obtain the information of the second AV directly through a connection established between the first and the second AVs, or indirectly through a relay, a centralized processing system or server, or a distributed processing system. For discussion purposes, consider a situation where AV 320 has several faulty sensors; it may then be possible to utilize sensor data from AV 321 and AV 325 to supplement missing information from the failed sensors of AV 320. As another example, information from multiple AVs, as well as sensors of a smart road, may be aggregated to augment the autonomous driving of the AVs operating in the vicinity of the sensors and AVs. For discussion purposes, consider a situation where sensor data from AVs 320-328 and sensors 310-317 is aggregated to improve the autonomous driving performance of AVs 320-328. However, where there is a large amount of sensor data to process, the computational requirements may exceed the capabilities of a single AV or centralized computing system. Therefore, there is a need for methods and apparatus for supporting AVs in distributed computing systems.
According to an example embodiment, a distributed computing system, e.g., a multi-edge computing (MEC) system, supporting AVs is provided. A MEC system for supporting AVs may be referred to as a dynamic traffic road management (DTRM) system. As an example, the DTRM system provides support in terms of data or information support, as well as computation support. In general, the DTRM system may be utilized to reduce the computational resource requirements of an AV, as well as data storage and movement requirements of the AV.
The DTRM system may provide data or information support in terms of aggregating data or information from multiple sources and then providing the aggregated data or information to an AV. As an example, the amount of data being delivered to the AV may exceed the storage capabilities of the AV. In such a situation, the DTRM system may perform data processing and deliver only the data that is relevant to the AV, thereby reducing the storage and computational requirements of the AV. The DTRM system may provide computational resource support to an AV in a situation where the AV requires more computation resources than available on the AV. As an example, the processing of the data intended for an AV may in some instances exceed the computational resources available at the AV (while ensuring sufficient computational resources remain available to ensure safe autonomous vehicle operation). In such a situation, the DTRM system may perform at least a portion of the computation and provide the results of the computation to the AV, thereby freeing precious computational resources at the AV.
As an example, the DTRM system provides data handling and processing support for AVs. In an example, the DTRM system aggregates sensor data and information from information services and provides the aggregated information to the AVs. The aggregated information may be provided to the AVs in a raw (e.g., unprocessed) form or in a processed form. The aggregated information may be provided to the AVs based on the location of the AVs. As an example, an AV is provided aggregated information that is within a specified distance of the AV or at an expected (or predicted) location of the AV. By providing relevant information to the AV, the amount of information being transmitted to the AVs is reduced, thereby reducing the communication traffic load on the communication network, as well as reducing the storage and processing needs at individual AVs. In another example, the DTRM system provides sensor data and information that is ordinarily unavailable at the AV to ensure safe autonomous operation by augmenting the data and information available at the AV. The AV can make use of the provided sensor data to improve autonomous operation.
As an example, the DTRM system provides processing support for AVs. In an example, the DTRM system performs computational operations on aggregated sensor data and information from information services to develop models or maps to help the AV operate autonomously in an efficient manner. In an example, the DTRM system combines sensor data (from mobile sensors in AVs and fixed sensors deployed in and around the smart road) and information from information services to develop models or maps of the traffic, roads, weather, etc., using available computational resources to help the AV operate autonomously in a situation when the AV has a faulty or impaired sensor.
According to an example embodiment, sensor data associated with a faulty or impaired sensor is replaced with sensor data of an alternate sensor located near the faulty or impaired sensor. If the alternate sensor is located within a specified distance of the faulty or impaired sensor and if the alternate sensor and the faulty or impaired sensor are of the same sensor type, the sensor data of the alternate sensor may be a suitable substitute for the faulty or impaired sensor. In an embodiment, sensor data from a plurality of alternate sensors may be used as a substitute for the sensor data from the faulty or impaired sensor. In general, larger numbers of alternate sensors may yield better performance. However, too many alternate sensors may unnecessarily increase the amount of sensor data to process and needlessly increase the computational requirements without significantly improving performance.
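The alternate-sensor selection described above can be sketched as a short routine. This is an illustrative sketch, not the specification's implementation: the dictionary field names (`type`, `x`, `y`) and the cap on the number of alternates are assumptions made for the example.

```python
from math import hypot

def find_alternates(faulty, candidates, max_distance, max_count=3):
    """Select alternate sensors whose data may substitute for a faulty sensor.

    Illustrative criteria from the embodiment above: same sensor type and
    location within a specified distance, capped at max_count alternates to
    avoid needlessly increasing the amount of sensor data to process.
    """
    def dist(s):
        return hypot(s["x"] - faulty["x"], s["y"] - faulty["y"])

    matches = [s for s in candidates
               if s["type"] == faulty["type"] and dist(s) <= max_distance]
    matches.sort(key=dist)  # prefer the nearest alternates
    return matches[:max_count]
```

Capping the list reflects the trade-off noted above: additional alternates add sensor data to process without necessarily improving performance.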
Figure 4 is a diagram of a system 400 highlighting a resource pool used to provide support for AV 405 operation. As shown in Figure 4, AV 405 is in motion along route 406 and is processing sensor data from sensors 407 and information services 408. AV 405 is in coverage of a DTRM system.
As AV 405 moves along route 406, AV 405 may leave the coverage of a first antenna 410 and enter the coverage of a second antenna 411 or a third antenna 412. In some situations, AVs may be in the coverage of multiple antennas. As shown in Figure 4, antennas 410, 411, and 412 make up a radio access network (RAN) 413 providing connectivity for AV 405 with a DTRM system 414. A resource pool 415 represents the available DTRM nodes (MEC nodes) of DTRM sites (MEC sites) of DTRM system 414 that are capable of communicating with AV 405 as AV 405 moves along route 406. A DTRM site corresponds to a coverage area or service area within which MEC services are provided, and may include one or more DTRM nodes providing the MEC services, such as the data/information processing and storage described above. A DTRM site may be configured with a DTRM system. Each DTRM node may be viewed as a data center or an edge cloud deployed close to users. Each DTRM node may provide computing resources, memory resources, storage resources, etc., for executing applications in virtual machines (VMs) or in containerized environments, such as Docker containers. In general, each DTRM node has considerably more resources (e.g., multiple multi-core processors, multiple gigabytes of memory, high-bandwidth connectivity, etc.) than the devices of users. In an embodiment, resource pool 415 includes available DTRM nodes that are capable of communicating with AV 405 within a particular time window. Hence, resource pool 415 may exclude DTRM nodes that are capable of communicating with AV 405, but only at a significant time in the future. Excluding such DTRM nodes may be advantageous because AV 405 may no longer be on route 406 so far in the future, and planning for such a possibility may be a waste of resources. Furthermore, uncertainty typically increases with time, and hence planning so far into the future may yield inaccurate results.
As an example, resource pool 415 may include a subset of the DTRM nodes of DTRM sites 416, 417, 418, and 419. DTRM site 419 may be a redundant computing resource interconnecting DTRM sites 416, 417, and 418, and work as a backup computing resource. It may also be a high-level computing node that helps coordinate computing between DTRM sites 416, 417, and 418. In general, a DTRM node may be a part of resource pool 415 for AV 405 if the DTRM node has sufficient computational resources to provide computational resources or data processing to AV 405 while AV 405 is within coverage of a DTRM site including the DTRM node. If the DTRM node has insufficient computational resources or cannot provide data processing to AV 405 while AV 405 is within coverage of the DTRM site associated with the DTRM node, the DTRM node is not a member of resource pool 415.
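The pool-membership rule above can be sketched as a filter over candidate nodes. This is a sketch under assumptions: the node field names, the spare-CPU metric, and the predicted coverage-entry times are hypothetical stand-ins for whatever resource and mobility predictions a real DTRM system maintains.

```python
def build_resource_pool(nodes, coverage_entry_s, required_cpu, window_s):
    """Assemble a resource pool of DTRM nodes for one AV (illustrative).

    A node joins the pool only if the AV is predicted to enter its coverage
    within the time window and the node has sufficient spare resources;
    nodes reachable only far in the future are excluded, since mobility
    uncertainty makes planning for them wasteful.
    """
    pool = []
    for node in nodes:
        entry = coverage_entry_s.get(node["id"])  # predicted entry time (s)
        if entry is None or entry > window_s:
            continue  # too far in the future, or never in coverage
        if node["free_cpu"] >= required_cpu:
            pool.append(node["id"])
    return pool
```

A node with ample resources but a coverage-entry time beyond the window is still excluded, matching the rationale given above for bounding the planning horizon.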
Figure 5A is a diagram 500 of AV 505 and DTRM nodes providing computational resources or data processing to AV 505 at a first time instance. At the first time instance, DTRM nodes 511, 512, 513, and 514 (shown as shaded circles) are providing computational resources or data processing to AV 505 (traversing route 506), while other DTRM nodes, such as DTRM nodes 510 and 515 (shown as clear circles) are not chosen to provide computational resources or data processing (or at least not yet providing computational resources or data processing) to AV 505. Although the discussion focuses on DTRM nodes, the discussion also applies to DTRM sites.
DTRM nodes 511, 512, 513, and 514 may be DTRM nodes that are within a time window of a current location of AV 505. The DTRM nodes that are within the time window may be expected to be serving AV 505 or will be serving AV 505 with a certain probability. The size of the time window may vary based on factors including environmental factors, traffic information, emergency information, etc. In general, the time window should be sufficiently large to enable efficient scheduling of multiple DTRM nodes at any one time; however, the time window should not be too large, else the uncertainty associated with the mobility of AV 505 may result in too many scheduled jobs and tasks having to be rescheduled.
As an alternative to a time window, the number of DTRM nodes providing computational resources or data processing to AV 505 may be a pre-specified value. As an example, in Figure 5A, the number of DTRM nodes is four, although other numbers may be pre-specified. The number of DTRM nodes may be related to the time window. As an example, if AV 505 is moving rapidly, the number of DTRM nodes may be greater, while if AV 505 is moving slowly, the number of DTRM nodes may be smaller.
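The speed-dependent node count described above can be illustrated with a small helper. The linear relation, the assumed average spacing between node coverage areas, and the minimum of two nodes are all assumptions for illustration; the specification only states that faster AVs may warrant more nodes.

```python
def nodes_to_schedule(speed_mps, window_s, node_spacing_m, min_nodes=2):
    """Relate the number of scheduled DTRM nodes to AV speed (illustrative).

    A faster AV crosses more coverage areas within the scheduling window;
    node_spacing_m is an assumed average spacing between adjacent node
    coverage areas.
    """
    distance_m = speed_mps * window_s  # how far the AV travels in the window
    return max(min_nodes, round(distance_m / node_spacing_m) + 1)
```

For example, an AV at highway speed over a one-minute window yields more scheduled nodes than a slow-moving AV over the same window.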
Figure 5B is a diagram 550 of AV 505 and DTRM nodes providing computational resources or data processing to AV 505 at a second time instance. The second time instance occurs later in time than the first time instance. At the second time instance, DTRM nodes 512, 513, 514, and 515 are within the time window and are providing computational resources or data processing to AV 505 (traversing route 506), while other DTRM nodes, such as DTRM nodes 510, 511, and 516 are not chosen to provide computational resources or data processing (or at least not yet providing computational resources or data processing) to AV 505.
In an embodiment, the geolocation information of an AV is used to control the support provided to the AV by the DTRM system. The geolocation information of an AV includes the current location of the AV, the estimated location of the AV at a future time, the velocity of the AV, the direction of the AV, the destination of the AV, the projected path of the AV, and so on. The geolocation information of the AV may be determined in accordance with location information provided by a Global Navigation Satellite System (GNSS) (such as Global Positioning System (GPS), Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), Galileo, BeiDou Navigation Satellite System (BDS), etc.), cellular communication system based location measurement information, velocity information, and so on. The geolocation information may be used to determine which parts of the DTRM system may be used to provide support to the AV. Additionally, the geolocation information may be used to determine which type of support is provided to the AV. As an illustrative example, one or more DTRM nodes of the DTRM system are assigned to support the AV in accordance with the geolocation information of the AV. The one or more DTRM nodes may change as the AV moves along its route, e.g., a first DTRM node may be unassigned from providing support to the AV as the AV exits the coverage of the first DTRM node, while a second DTRM node may be assigned to provide support to the AV as the AV enters the coverage of the second DTRM node.
In an embodiment, the geolocation information of an AV is used to control the augmentation of sensor data used to support the operation of the AV. The geolocation information may be used to identify the sensor data from specific sensors that may be used to augment the sensor data provided by the sensors of the AV. As an illustrative example, the geolocation information of the AV is used to identify AVs that are within a specified distance (i.e., "close") of the AV, as well as fixed sensors of the smart road, that may provide sensor data usable in augmenting the sensor data of the AV. The sensor data from the AV, along with the sensor data from the identified AVs and fixed sensors, may be processed to support the operation of the AV. In an embodiment, information from information services is also identified by the geolocation information of the AV and used to support the operation of the AV. The information from the information services may include topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
Co-assigned International patent application No. PCT/US2020/061815, filed on November 23, 2020, and entitled "Methods and Apparatus for Supporting Application Mobility in Multi-Access Edge Computing Platform Architectures," which is hereby incorporated herein by reference in its entirety, discloses methods and apparatus supporting service mobility in MEC systems (e.g., DTRM systems). A service, such as autonomous driving, may be partitioned into tasks and mapped onto MEC nodes (e.g., DTRM nodes) of MEC sites (e.g., DTRM sites) based on the requirements of the service and geolocation information of the AV. The partitioning and mapping are performed so that the service is continuous while the AV moves along its route.
In an embodiment, sensor data that is missing or unreliable due to a faulty or damaged sensor is replaced by sensor data from functional sensors (i.e., sensors that are functioning normally) in accordance with the geolocation information associated with the faulty or damaged sensor. For example, the sensor data from the faulty or damaged sensor may be replaced with sensor data from a functional sensor in a situation where the faulty or damaged sensor and the functional sensor are in close proximity (e.g., the location of the faulty or damaged sensor is within a specified threshold of the location of the functional sensor). The specified threshold is a predefined value or a value that can be dynamically adjusted based on operating conditions. As an example, when the operating condition is poor (e.g., rain, snow, sleet, fog, dark, etc.), the specified threshold is small, while when the operating condition is good (e.g., clear sky, day, no fog, etc.), the specified threshold is large.
In an embodiment, the faulty or damaged sensor and the functional sensor are of the same sensor type. In another embodiment, the faulty or damaged sensor and the functional sensor have the same or similar sensing range. In other words, the sensor data from the faulty or damaged sensor and from the functional sensor should cover approximately the same area or the sensing range of the functional sensor is a superset of the sensing range of the faulty or damaged sensor. The functional sensor should "see" about the same or more than the faulty or damaged sensor.
In an embodiment, the sensor data from the faulty or damaged sensor is replaced with sensor data from a plurality of functional sensors. In a situation when there is a plurality of functional sensors, the sensor data from the plurality of functional sensors may be used in place of the sensor data from the faulty or damaged sensor. In an embodiment, the specified threshold may be relaxed when there are multiple functional sensors. In general, the amount of relaxation of the specified threshold is dependent on the number of functional sensors.
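The dynamic threshold and its relaxation for multiple functional sensors can be sketched together. The specific scaling factors (halving in poor conditions, 10% relaxation per additional functional sensor) are illustrative assumptions; the specification only states the directions of the adjustments.

```python
def replacement_threshold(base_m, poor_conditions, n_functional=1):
    """Compute the proximity threshold for substituting sensor data (sketch).

    Per the embodiments above: the threshold shrinks in poor operating
    conditions (rain, snow, fog, dark) and relaxes as more functional
    sensors contribute. Scaling factors are assumed for illustration.
    """
    factor = 0.5 if poor_conditions else 1.0
    # More functional sensors permit a gentler (larger) threshold.
    relaxation = 1.0 + 0.1 * max(0, n_functional - 1)
    return base_m * factor * relaxation
```

The amount of relaxation grows with the number of functional sensors, mirroring the statement above that the relaxation depends on how many functional sensors are available.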
Figure 6 is a diagram of a smart road deployment 600 with a DTRM system providing AV operation support. Smart road deployment 600 supports the operation of a plurality of AVs, including AVs 605-609. Smart road deployment 600 includes a DTRM system with a plurality of DTRM sites, including DTRM sites 610, 612, and 614, with coverage areas 611, 613, and 615, respectively.
Each AV includes one or more sensors (SNSRs), which are referred to as mobile sensors due to their location on the AVs. Sensor data from the mobile sensors may be utilized by the AV on which they are deployed. The sensor data may be communicated to the DTRM system, where they may be used to augment the operation of the AVs. Smart road deployment 600 also includes sensors (such as sensors 620-628) that are deployed on or in the roads, in or on buildings, on lights, on signs, and so on. These sensors are referred to as fixed sensors due to their being immobile (when compared to the mobile sensors of AVs). Sensor data from the fixed sensors may be communicated to the DTRM system, where they can be used to augment the operation of the AVs. As an illustrative example, sensor data from sensor 623 may be used to replace sensor data from a faulty sensor located on AV 605, while sensor data from sensors 623 and 625 may help support the operation of AV 606. As shown in Figure 6, AV 606 is in coverage area 615 of DTRM site 614, while sensor 623 is in the coverage area 611 of DTRM site 610. In such a situation, sensor data from sensor 623 may be shared with DTRM site 614. As another illustrative example, sensor data from AV 606 may be used to replace sensor data from a faulty sensor located in AV 605.
As shown in Figure 6, sensor data from mobile sensors or fixed sensors may be used to replace sensor data from faulty or damaged sensors. The sensor data may be utilized in one or more DTRM sites to help support the operation of AVs with faulty or damaged sensors. As an example, sensor data from fixed sensor 627 may be used to replace or augment sensor data from a faulty or damaged sensor of AV 607.
The sensor data may also be used to support the operation of AVs without any faulty or damaged sensors. In other words, the sensor data may be used to enhance the operation of fully operational AVs. As an example, with sensor data from additional sensors, the operation of AVs may be improved, e.g., in safety, speed, and so on. As an example, even if AV 608 has no faulty or damaged sensors, sensor data from fixed sensors 626 and 628 may be used to help improve the operation of AV 608. The examples presented herein are intended for illustrative purposes and are not intended to limit the scope of the example embodiments.
According to an example embodiment, services are deployed at DTRM nodes of a DTRM system to support the operation of AVs. Deploying services at DTRM nodes implements a distributed system for supporting the operation of AVs. The distributed implementation helps to reduce the overhead associated with communicating the sensor data from mobile and fixed sensors deployed in the DTRM system, as well as distributing information usable in assisting the AVs throughout the DTRM system. Furthermore, the distributed implementation facilitates a level of fault tolerance and redundancy that is important in a time-critical application, such as autonomous driving.
Figure 7 is a diagram of a portion of a DTRM system 700 deploying an autonomous driving support system, highlighting a DTRM node 705, services implemented at DTRM node 705 for supporting AVs, and AVs 710-712. DTRM node 705 is connected to other DTRM nodes of DTRM sites of DTRM system 700. AVs 710-712 may have subscriptions with DTRM system 700, and may also be referred to as subscribed AVs. DTRM node 705 is connected to AVs 710-712, providing support for autonomous driving by AVs 710-712. Although shown in Figure 7 as supporting three AVs, the number of AVs supported by a single DTRM node is typically dynamic, changing as the AVs move in and out of the coverage area of the DTRM node 705. As an example, DTRM node 705 may begin to support a new AV when the new AV enters the coverage area of DTRM node 705, and stop supporting the new AV when the new AV exits the coverage area of DTRM node 705. Typically, when an AV exits the coverage area of a DTRM node, it will enter the coverage area of another DTRM node. However, it is possible that the AV will exit the coverage of DTRM system 700 entirely. In such a situation, the AV may stop receiving autonomous driving support unless another DTRM system is available and the AV has a subscription with the other DTRM system.
DTRM node 705 includes a messaging service 720 that is configured to support communications for DTRM node 705. Messaging service 720 processes messages sent from and received by DTRM node 705. Messaging service 720 includes a message agent 722 that is configured to provide a messaging service. Message agent 722 may be configured to establish, manage, and control a messaging flow and message exchanges associated with the messaging flow. Messages exchanged may include information of a mobile device (e.g., devices in an AV), a user, an account of the mobile device or the user, a service associated with an application, security, priority, data to be processed or that has been processed, policy, regional map, flow table, states of DTRM nodes, scheduling information, tasks/jobs, and any other information related to execution of a service provided to an AV. As an example, message agent 722 may receive a messaging service request, e.g., from another DTRM node or from an AV, verify the messaging service request, establish and tear down a messaging flow, create a virtual messaging network, generate and assign a messaging flow ID, assist in the performing of planning and scheduling of tasks and jobs, distribute scheduling information, distribute the tasks and jobs, deliver to-be-processed and processed data, and maintain a regional map and flow table. Messaging service 720 also includes an alarm message agent 724 that is configured to process alarm messages. The alarm messages may be used to inform an AV of a potentially dangerous situation, for example. The alarm messages may also be used to alert DTRM node 705 of a potentially dangerous situation.
Each AV connected to DTRM node 705, e.g., AVs 710-712, includes a message agent that is configured to establish, manage, and control a messaging flow and message exchanges associated with the messaging flow. In other words, the message agents in the AVs support communication by the AV. DTRM node 705 includes a traffic-road (TR) modeling service 730 that is configured to generate and maintain models of the roads, traffic, and weather. TR modeling service 730 makes use of sensor data (from mobile sensors and fixed sensors, for example), as well as information from information services (e.g., weather services, traffic services, emergency services, road condition services, map services, and so on).
TR modeling service 730 includes a road model agent 732 that is configured to generate and maintain a road model using sensor data and information from information services. As an example, using mapping information from map services, in conjunction with sensor data, road model agent 732 generates a road model of the roads covered by DTRM system 700, highlighting road construction, road damage, road condition (e.g., icy, slick, wet, etc.), and so on. Road model agent 732 also updates the road model based on changes in the sensor data and the information from information services. As an example, a formerly icy road may no longer be icy, road construction may have completed at one location, while road construction has started at another.
Road model agent 732 provides the road model to the AVs connected to DTRM node 705. The road model provided to a particular AV is in accordance with the geolocation information of the AV. As an example, only a portion of the road model that is within a specified distance of the AV is provided to the AV, where the specified distance may be a distance that the AV is expected to travel within a given time duration, for example. Providing only the portion of the road model to the AV helps reduce the amount of data communicated to the AVs. The road model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
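The location-based filtering above can be sketched as a distance filter over model entries, with the specified distance derived from the AV's expected travel within a time horizon. The entry representation (planar coordinates per model entry) is an assumption; a real road model would be richer.

```python
from math import hypot

def model_portion(entries, av_xy, speed_mps, horizon_s):
    """Extract only the portion of a model relevant to one AV (sketch).

    The specified distance is taken as the distance the AV is expected to
    travel within the time horizon, as described above.
    """
    reach_m = speed_mps * horizon_s  # expected travel distance
    return [e for e in entries
            if hypot(e["x"] - av_xy[0], e["y"] - av_xy[1]) <= reach_m]
```

The same filtering applies equally to the weather and traffic models, since each is delivered per AV according to the AV's geolocation information.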
TR modeling service 730 includes a weather model agent 734 that is configured to generate and maintain a weather model using sensor data and information from information services. As an example, using weather information from weather services, in conjunction with sensor data, weather model agent 734 generates a weather model for the roads covered by DTRM system 700, highlighting weather conditions of roads covered by DTRM system 700. As an example, the weather model indicates rain, snow, ice, fog, etc., for the roads covered by DTRM system 700. Weather model agent 734 also updates the weather model based on changes in the sensor data and the information from information services. As an example, weather model agent 734 refines the weather condition of the roads based on sensor data received from mobile and fixed sensors throughout DTRM system 700. As an example, weather model agent 734 may set up a message channel with message agent 722, and receive updated data from surrounding mobile and fixed sensors in DTRM system 700. Weather model agent 734 may fuse all received data based on its weather model, and refine the weather model to improve the accuracy of the weather model.
Weather model agent 734 provides the weather model to the AVs connected to DTRM node 705. The weather model provided to a particular AV is in accordance with the geolocation information of the AV. As an example, only a portion of the weather model that is within a specified distance of the AV is provided to the AV, where the specified distance may be a distance that the AV is expected to travel within a given time duration, for example. Providing only the portion of the weather model to the AV helps to reduce the amount of data communicated to the AVs. The weather model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
TR modeling service 730 includes a traffic model agent 736 that is configured to generate and maintain a traffic model using sensor data and information from information services. As an example, using traffic information, emergency information, in conjunction with sensor data, traffic model agent 736 generates a traffic model for the roads covered by DTRM system 700, highlighting traffic on the roads. As an example, traffic model indicates traffic congestion, flow, speeds, etc., for the roads covered by DTRM system 700. Traffic model agent 736 also updates the traffic model based on changes in the sensor data and the information from information services. As an example, traffic model agent 736 alters the traffic condition of the roads based on the sensor data and the information from information services.
Traffic model agent 736 provides the traffic model to the AVs connected to DTRM node 705. The traffic model provided to a particular AV is in accordance with the geolocation information of the AV. As an example, only a portion of the traffic model that is within a specified distance of the AV is provided to the AV, where the specified distance may be a distance that the AV is expected to travel within a given time duration, for example. Providing only the portion of the traffic model to the AV helps to reduce the amount of data communicated to the AVs. The traffic model may also be provided to other DTRM nodes, such as DTRM nodes that are within a specified distance from DTRM node 705.
The TR modeling service is also configured to generate a three-dimensional (3D) map based on the traffic model, weather model, and road model. As an example, the 3D map may show static objects and dynamic moving objects in a 3D space, and may indicate weather conditions, traffic lights, traffic incidents, and so on. The 3D map may be provided to AVs connected to DTRM node 705. The 3D map provided to a particular AV is in accordance with the geolocation information of the AV. As an example, only a portion of the 3D map that is within a specified distance of the AV is provided to the AV, where the specified distance may be a distance that the AV is expected to travel within a given time duration, for example. The 3D map is dynamic, and may be updated based on updates to the traffic model, weather model, and road model.
DTRM node 705 includes a TR fusion service 740 that is configured to provide an interface for inputs from different sources (e.g., mobile sensors, fixed sensors, information services, etc.) with TR modeling service 730. TR fusion service 740 communicates with TR modeling service 730 through messaging service 720.
TR fusion service 740 includes a connection management unit 742 that is configured to maintain AV connection state. Connection management unit 742 helps establish a connection with an AV as the AV enters the coverage area of DTRM node 705 and helps handover the AV as the AV exits the coverage area of DTRM node 705, for example. TR fusion service 740 includes a fixed sensor management unit 744 that is configured to compare and validate sensor data from fixed sensors. TR fusion service 740 also includes motion sensor management unit 746 that is configured to compare and validate sensor data from mobile sensors. Fixed sensor management unit 744 and motion sensor management unit 746 help ensure that sensor data from respective sensors are valid.
TR fusion service 740 includes a sensor fusion unit 748 that is configured to combine sensor data from multiple sources (e.g., multiple mobile sensors from multiple AVs, multiple fixed sensors, a combination of mobile sensors and fixed sensors, etc.). Sensor fusion unit 748 may make use of information from fixed sensor management unit 744 and motion sensor management unit 746 in the combining of the sensor data. As an example, sensor fusion unit 748 eliminates sensor data from faulty or damaged sensors and replaces the sensor data with sensor data from other sensors that are within a specified threshold of the faulty or damaged sensors.
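The elimination-and-replacement behavior of sensor fusion unit 748 can be sketched with a deliberately simplified scalar-reading representation; real sensor fusion over camera, radar, and LiDAR data would be far richer, and the mean-of-substitutes estimate here is an assumption for illustration only.

```python
def fuse_sensor_data(readings, faulty_ids, substitutes):
    """Combine sensor readings, replacing faulty sources (illustrative).

    readings: {sensor_id: value}. For each faulty sensor, the mean reading
    of its nearby functional substitute sensors stands in for its own data,
    as performed by sensor fusion unit 748 above.
    """
    fused = {}
    for sid, value in readings.items():
        if sid in faulty_ids:
            alts = [readings[a] for a in substitutes.get(sid, ())
                    if a in readings and a not in faulty_ids]
            if alts:
                fused[sid] = sum(alts) / len(alts)  # replacement estimate
        else:
            fused[sid] = value  # functional sensor: keep its own data
    return fused
```

A faulty sensor with no eligible substitutes simply contributes nothing, leaving that portion of coverage to be flagged rather than fabricated.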
TR fusion service 740 includes a vehicle tracker 750 that is configured to maintain and update AV location in the dynamic three-dimensional (3D) map. As an example, vehicle tracker 750 obtains the location of the AVs (e.g., from geolocation information of the AVs) and updates the dynamic 3D map. TR fusion service 740 also includes a dynamic 3D map unit 752 that is configured to maintain and update the dynamic 3D map in accordance with sensor data and information from information services. As an example, dynamic 3D map unit 752 updates the road conditions, weather conditions, traffic conditions, etc., based on the sensor data and information from information services received by TR fusion service 740. The updated dynamic 3D map may be provided to subscribers and other DTRM nodes.

Figure 8 is a diagram 800 of communication between and operations performed by participants involved in autonomous driving or providing support for autonomous driving. The participants involved in autonomous driving or providing support for autonomous driving include AV 805, messaging agent 807, DTRM node 809, and authentication & policy unit 811. Authentication & policy unit 811 is an entity that may be a part of DTRM node 809, a DTRM system including DTRM node 809, or an infrastructure system that is used to authenticate and authorize participants (e.g., AV 805), to ensure that the participants have a valid subscription, for example. As an example, authentication & policy unit 811 may be an authentication, authorization, and accounting (AAA) function of a Third Generation Partnership Project (3GPP) compliant network.
AV 805 sends a connection request to messaging agent 807 of DTRM node 809 (event 815). AV 805 sends the connection request to initiate the connection process with DTRM node 809, for example. AV 805 may send the connection request when it comes within coverage range of DTRM node 809. The connection request may include an identifier of AV 805. Messaging agent 807 forwards the connection request to DTRM node 809 (event 817). The forwarded connection request may be identical to the connection request received from AV 805, or it may be altered.
DTRM node 809 sends an authentication request (event 819). The authentication request may be sent to authentication & policy unit 811 and is a request to authenticate AV 805. In other words, the authentication request checks to ensure that AV 805 is an authorized user of services of DTRM node 809. Authentication & policy unit 811 performs a policy and quality of service (QoS) check (block 821). The policy and QoS check may be a check of the authenticity of AV 805, for example. The policy and QoS check may also be a check of the QoS level requested by AV 805. As an example, the policy and QoS check determines if AV 805 has a subscription sufficient to meet the QoS level. Authentication & policy unit 811 sends an authentication confirmation (event 823). The authentication confirmation is sent to DTRM node 809 and confirms whether or not AV 805 has been authenticated and meets the QoS level. For discussion purposes, for the remainder of the discussion of Figure 8, it is assumed that AV 805 has successfully authenticated and met the QoS level.
DTRM node 809 sends a tracking identifier (ID) for AV 805 (event 825). The tracking identifier is transmitted to messaging agent 807 and may be used to identify AV 805 while AV 805 is connected to DTRM node 809. As an example, messages sent to AV 805 may be identified using the tracking identifier as the destination address. Similarly, messages sent by AV 805 may be identified using the tracking identifier as the source address. Messaging agent 807 assigns a messaging flow identifier for AV 805 and sends the messaging flow identifier (event 827). The messaging flow identifier is used to identify message flows from or to AV 805, for example. The messaging flow identifier is sent to AV 805. DTRM node 809 confirms the messaging connection (event 829).
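The identifier bookkeeping in the exchange above (events 825 through 829) can be sketched as follows; the class name, method names, and the choice to keep the mapping inside the messaging agent are illustrative assumptions:

```python
import itertools

class MessagingAgent:
    """Bookkeeping for tracking identifiers and messaging flow identifiers."""

    def __init__(self):
        self._next_flow = itertools.count(1)
        self._flow_by_tracking = {}  # tracking id -> messaging flow id

    def register(self, tracking_id):
        # The DTRM node assigned `tracking_id`; the agent now assigns a
        # messaging flow identifier used to identify this AV's message flows.
        flow_id = next(self._next_flow)
        self._flow_by_tracking[tracking_id] = flow_id
        return flow_id

    def flow_for(self, tracking_id):
        return self._flow_by_tracking.get(tracking_id)

    def route_to_av(self, tracking_id, payload):
        # Messages to the AV carry the tracking id as the destination address.
        return {
            "flow": self._flow_by_tracking[tracking_id],
            "dst": tracking_id,
            "payload": payload,
        }

    def stop_service(self, tracking_id):
        # On handover, the agent stops DTRM service for the AV and the
        # tracking identifier is removed.
        self._flow_by_tracking.pop(tracking_id, None)
```

The `stop_service` method corresponds to the handover clean-up described later, where the messaging agent stops DTRM service and the tracking identifier is removed.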
AV 805 sends sensor data (event 831). The sensor data, from sensors of AV 805, is sent to messaging agent 807. Messaging agent 807 sends the sensor data to DTRM node 809. In addition to the sensor data, information from information services may be sent to DTRM node 809. DTRM node 809 generates the traffic, weather, and road models (block 833). The traffic, weather, and road models may be generated in accordance with the sensor data and the information from information services, for example. In addition to the traffic, weather, and road models, DTRM node 809 generates a 3D map in accordance with the traffic, weather, and road models. DTRM node 809 provides the models to AVs (event 835). The traffic, weather, and road models may be sent to messaging agent 807, which forwards the models to AV 805. The traffic, weather, and road models are sent in accordance with the geolocation information of AV 805. As an example, only portions of the traffic, weather, and road models relevant to AV 805 are sent to AV 805. Similarly, the 3D map is provided to the AVs in accordance with the geolocation information of the AVs.
AV 805 sends updated sensor data (event 837). In general, sensors provide a stream of sensor data at specific intervals, e.g., a specified number of times per second. AV 805 sends the updated sensor data, as generated by its sensors, to messaging agent 807. Messaging agent 807 forwards the updated sensor data to DTRM node 809. DTRM node 809 updates the 3D map and models (block 839). The 3D map and models are updated in accordance with the updated sensor data and the information from information services, for example. DTRM node 809 provides the updated 3D map and models to the AVs. As an example, DTRM node 809 provides the updated 3D map and models to AV 805 in accordance with the geolocation information of AV 805.
As AV 805 moves, AV 805 exits the coverage area of DTRM node 809. In such a situation, AV 805 participates in a handover with another DTRM node and the session (the connection) is passed to the other DTRM node (event 843). Because AV 805 is being served by another DTRM node, messaging agent 807 stops the DTRM service for AV 805 (event 845) and DTRM node 809 removes the tracking identifier associated with AV 805 (block 847). Should AV 805 re-enter the coverage area of DTRM node 809, a new tracking identifier may be assigned to AV 805.

Figure 9 is a flow diagram of example operations 900 occurring in a DTRM node supporting AVs. Operations 900 may be indicative of operations occurring in a DTRM node as the DTRM node supports AVs by generating traffic, weather, and road models, as well as a 3D map, and providing the models and 3D map to the AVs. Furthermore, the DTRM node augments sensor data from faulty or damaged sensors with sensor data from other sensors.
Operations 900 begin with the DTRM node receiving sensor data (block 905). The sensor data may be received from AVs connected to the DTRM node. In addition, the DTRM node may receive sensor data from fixed sensors in the coverage area of the DTRM node. Furthermore, based on the mobility of the AVs connected to the DTRM node (i.e., the predicted or estimated location of the AVs), the DTRM node may receive sensor data from AVs outside of the coverage area of the DTRM node, as well as fixed sensors outside of the coverage area of the DTRM node. The DTRM node receives information from information services (block 907). The DTRM node receives information from services, such as topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
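The mobility-based selection of out-of-coverage sensors mentioned above can be sketched with a constant-velocity position prediction; the function name, the motion model, and the parameter names are assumptions, not details from this disclosure:

```python
def sensors_to_poll(av_state, sensors, horizon, margin):
    """Predict where the AV will be `horizon` seconds from now using a
    constant-velocity model, then select every sensor within `margin` of
    that predicted position, including sensors outside the node's current
    coverage area. `av_state` is ((x, y), (vx, vy)); `sensors` maps
    sensor id -> (x, y)."""
    (x, y), (vx, vy) = av_state
    px, py = x + vx * horizon, y + vy * horizon
    return sorted(
        sid for sid, (sx, sy) in sensors.items()
        if (sx - px) ** 2 + (sy - py) ** 2 <= margin ** 2
    )
```

A richer estimator (e.g., using the navigation route) could replace the constant-velocity prediction without changing the selection step.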
The DTRM node optionally augments the sensor data (block 909). In a situation where there are faulty or damaged sensors, the DTRM node augments the sensor data from the faulty or damaged sensors (or the missing sensor data from the faulty or damaged sensors) with sensor data from other sensors that are closely located to the faulty or damaged sensors. In an embodiment, in a situation where there are no faulty or damaged sensors, the DTRM node augments the sensor data from AVs, for example, with sensor data from sensors that are closely located to the AVs. In general, additional sensor data helps improve the quality of support provided to the AVs. For example, additional sensor data improves the quality of the sensor information, potentially resulting in higher quality models and maps.
The DTRM node generates models (block 911). The DTRM node generates the traffic, weather, and road models from the sensor data and the information from the information services. The DTRM node shares the models (block 913). The DTRM node shares information associated with the models to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV. As an example, instead of providing all information of the models to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared. The DTRM node generates a 3D map (block 915). The 3D map is generated from the sensor data and the information from the information services. The DTRM node shares the 3D map (block 917). The DTRM node shares information associated with the 3D map to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV. As an example, instead of providing all information of the 3D map to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared.
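The geolocation-based sharing described above can be sketched with a tiled map; the tile representation and function name are illustrative assumptions:

```python
def relevant_tiles(map_tiles, av_position, radius):
    """Return only the tiles of a tiled 3D map whose center lies within
    `radius` of the AV's reported position, so that just the portion of
    the map relevant to that AV is shared. `map_tiles` maps
    tile id -> (center_x, center_y, tile_data)."""
    ax, ay = av_position
    return {
        tile_id: data
        for tile_id, (cx, cy, data) in map_tiles.items()
        if (cx - ax) ** 2 + (cy - ay) ** 2 <= radius ** 2
    }
```

The same filter applies to the model information: each AV receives only the subset keyed to locations near its geolocation, reducing the amount of information shared.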
Figure 10 is a flow diagram of example operations 1000 occurring in a DTRM node updating models and the 3D map. Operations 1000 may be indicative of operations occurring in a DTRM node as the DTRM node updates the traffic, weather, and road models, as well as the 3D map, and provides the models and 3D map to the AVs. Furthermore, the DTRM node augments sensor data from faulty or damaged sensors with sensor data from other sensors.
Operations 1000 begin with the DTRM node receiving updated sensor data (block 1005). The sensor data may be received from AVs connected to the DTRM node. In addition, the DTRM node may receive updated sensor data from fixed sensors in the coverage area of the DTRM node. Furthermore, based on the mobility of the AVs connected to the DTRM node (i.e., the predicted or estimated location of the AVs), the DTRM node may receive updated sensor data from AVs outside of the coverage area of the DTRM node, as well as fixed sensors outside of the coverage area of the DTRM node. The DTRM node receives updated information from information services (block 1007). The DTRM node receives updated information from services, such as topography information, geography information, navigation system information, traffic information, emergency services information, weather information, and so on.
The DTRM node optionally augments the updated sensor data (block 1009). In a situation where there are faulty or damaged sensors, the DTRM node augments the updated sensor data from the faulty or damaged sensors (or the missing sensor data from the faulty or damaged sensors) with updated sensor data from other sensors that are closely located to the faulty or damaged sensors. In an embodiment, in a situation where there are no faulty or damaged sensors, the DTRM node augments the updated sensor data from AVs, for example, with updated sensor data from sensors that are closely located to the AVs. In general, additional sensor data helps improve the quality of support provided to the AVs. For example, additional sensor data improves the quality of the sensor information, potentially resulting in higher quality models and maps.

The DTRM node updates the models (block 1011). The DTRM node updates the traffic, weather, and road models from the updated sensor data and the updated information from the information services. The DTRM node shares the updated models (block 1013). The DTRM node shares information associated with the updated models to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV. As an example, instead of providing all information of the updated models to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps reduce the amount of information being shared.
The DTRM node updates the 3D map (block 1015). The 3D map is updated from the updated sensor data and the updated information from the information services. The DTRM node shares the updated 3D map (block 1017). The DTRM node shares information associated with the updated 3D map to the AVs, where the information shared with a particular AV is in accordance with the geolocation information of the AV. As an example, instead of providing all information of the updated 3D map to the AVs, the DTRM node shares only information relevant to the AVs based on the respective geolocation information of the AVs. Sharing only relevant information helps to reduce the amount of information being shared. The information about the models and/or the 3D maps shared with the AVs may be customizable for each AV based on its request, location, hardware configuration, or connection bandwidth. The DTRM node may constantly or periodically receive sensor data and information from the information services, and update the models and the 3D map. The timing, continuation, or frequency of compiling and supplying sensor data from sensors, and/or sharing models and the 3D maps, may be configurable.
The models and 3D map generated by a DTRM node may be used in various applications by the DTRM node, by AVs with which the models and 3D map are shared, or by other DTRM nodes (e.g., a neighboring DTRM node exchanging models and 3D maps with the DTRM node). As an example, based on the models and 3D map, a DTRM node may track traffic situations of an area and provide timely traffic reports (e.g., a traffic jam), accident reports (e.g., a car accident), and incident reports (e.g., road construction, traffic sign issues, road conditions, etc.). As another example, a DTRM node may, based on the models and 3D map, detect an abnormality of an AV and/or a sensor (a mobile sensor or a fixed sensor). For example, other cars are moving but one car is still, or a traffic light is red but one sensor shows green, etc. An AV may use the models and 3D map to "see" images or objects that are blocked by the vehicle in front of it. An AV may also obtain enhanced information for driving from the models and 3D maps, even if its embedded or pre-installed sensors are very powerful and up to date. Each AV may make its own independent planning and decisions based on the models and 3D map provided by the DTRM node. The 3D map represents multi-dimensional information, including information about road, traffic, and weather conditions, within a boundary of 3D space and time.
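The abnormality checks suggested above (for example, one sensor reporting a traffic light as green while the rest report red) can be sketched as a majority vote over sensors observing the same object; the report structure and function name are assumptions:

```python
from collections import Counter

def flag_disagreeing_sensors(reports):
    """Flag sensors whose reading disagrees with the majority of sensors
    observing the same object. `reports` maps sensor id -> observed state
    for a single shared object (e.g., one traffic light)."""
    if not reports:
        return []
    # The most common observed state is taken as ground truth.
    majority, _ = Counter(reports.values()).most_common(1)[0]
    return sorted(sid for sid, state in reports.items() if state != majority)
```

The same pattern applies to the still-car example: a vehicle whose reported motion disagrees with surrounding observations would be flagged for closer inspection.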
In some embodiments, both AVs and DTRM (MEC) nodes may support fault detection of sensors. If a hardware failure occurs on an AV, the AV system may detect the failure and send a request to DTRM node(s) for help. A DTRM node may detect faults or failures happening to the AV, based on its knowledge (models), e.g., videos or pictures from a sensor are blacked out or have decode errors. The DTRM node may advise the AV about potential faults or failures. The DTRM node may provide information or data which is needed by the faulty or failed sensor as inputs. It may provide extra information or data which is compiled from multiple sources, e.g., other AVs, sensors on smart infrastructure, and so on.
Figure 11 is a flow diagram of embodiment operations 1100 for faulty sensor detection. Operations 1100 may be indicative of operations occurring in an AV. Operations 1100 begin with the AV detecting that a sensor has a hardware failure (block 1105). That is, the sensor does not work. The sensor may be referred to as a failed sensor in the following description. The AV may detect the failure when, e.g., no sensor data/signal is received from the sensor, a video or a picture from the failed sensor is blacked out or has a decode error, and so on. The AV may send a request to a DTRM node for help (block 1107). The DTRM node may be one that the AV is connected with. The request may include information about the failed sensor. The request may ask the DTRM node for recovery of sensor data of the failed sensor. That is, the AV may request the DTRM node to provide sensor data that can be used in place of sensor data of the failed sensor. The DTRM node may acknowledge the request (block 1109). A message exchange channel may be set up between the DTRM node and the AV, and the message exchange channel may be marked with a real-time requirement tag (block 1111). The message exchange channel is used for exchanging messages between the DTRM node and the AV in response to the request of the AV. The DTRM node may determine to obtain sensor data from one or more other sensors (referred to as replacement sensors) to be used in place of sensor data of the failed sensor. The replacement sensors may include a fixed sensor and/or a sensor from another AV. A replacement sensor may be of the same sensor type as the failed sensor. The failed sensor and a replacement sensor may have the same or a similar sensing range. The failed sensor and a replacement sensor may be in close proximity (e.g., the location of the failed sensor is within a specified threshold of the location of the replacement sensor).
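The failure conditions listed above (no signal, a blacked-out frame, a decode error) can be sketched as a simple health check on a camera-type sensor; the function name, labels, and thresholds are illustrative assumptions rather than details from this disclosure:

```python
def classify_sensor_fault(last_frame, last_seen, now, timeout, black_level=2):
    """Classify a sensor's health from its most recent frame (a list of
    pixel intensities, or None if the frame failed to decode), the time it
    was last heard from, and the current time."""
    if now - last_seen > timeout:
        return "no-signal"      # no sensor data/signal received at all
    if last_frame is None:
        return "decode-error"   # data arrived but the frame could not decode
    if all(pixel <= black_level for pixel in last_frame):
        return "blacked-out"    # frame decodes but is uniformly near-black
    return "ok"
```

A non-"ok" result would trigger the request to the DTRM node in block 1107.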
The DTRM node may send sensor data of the replacement sensors to the AV using the message exchange channel. The AV may use the received sensor data in place of the sensor data of the failed sensor.
Figure 12 is a flow diagram of embodiment operations 1200 for faulty sensor detection. Operations 1200 may be indicative of operations occurring in a DTRM node. In this example, a sensor of an AV does not have a hardware failure, but the sensor signals are abnormal, e.g., the sensor is blocked or covered by mud, snow, another object, or another vehicle. The AV may send sensor data of its sensors to the DTRM node. The DTRM node receives the sensor data and detects that a signal or signals of the sensor are blacked out or unavailable (block 1205). The DTRM node may send a notification to the AV, notifying the AV of the detection, and may offer sensor input from one or more other sensors (e.g., from a fixed sensor or a sensor of another AV) (block 1207). In response, the AV may check and confirm the issue with the sensor and open an air socket ID of a socket for wireless communication (block 1209). When the AV confirms that the sensor has an issue, it may determine to communicate with the DTRM node regarding the issue and open a dedicated real-time communication channel for this sensor, so that the AV receives DTRM data to replace the data of the malfunctioning sensor or sensors. A message exchange channel may then be set up between the DTRM node and the AV and associated with the air socket ID, and the message exchange channel may be marked with a real-time requirement tag (block 1211). The message exchange channel is used for exchanging messages between the DTRM node and the AV in response to the notification from the DTRM node.
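The node-side detection and channel set-up can be sketched together; the data structures, field names, and the blacked-out test are illustrative assumptions:

```python
def handle_av_sensor_stream(sensor_frames, open_socket):
    """Detect sensors whose frames are blacked out or missing, and set up
    a message exchange channel for each so replacement data can be
    streamed. `sensor_frames` maps sensor id -> latest frame (a list of
    pixel values) or None when no signal arrived; `open_socket` returns a
    fresh air socket id."""
    channels = {}
    for sensor_id, frame in sensor_frames.items():
        blacked_out = frame is None or all(p == 0 for p in frame)
        if blacked_out:
            channels[sensor_id] = {
                "air_socket": open_socket(),
                "realtime": True,  # real-time requirement tag on the channel
            }
    return channels
```

In the flow of Figure 12 the socket is opened by the AV after it confirms the issue; here the allocation is folded into one function purely to keep the sketch self-contained.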
Figure 13 is a block diagram of a computing system 1300 according to an embodiment. The computing system 1300 may be used for implementing the devices and methods disclosed herein. For example, the computing system can be a DTRM node or a DTRM site. Specific devices may utilize all of the components shown or only a subset of the components, and levels of integration may vary from device to device. Furthermore, a device may contain multiple instances of a component, such as multiple processing units, processors, memories, transmitters, receivers, etc. The computing system 1300 includes a processing unit 1302. The processing unit 1302 includes a central processing unit (CPU) 1314, memory 1308, and may further include a mass storage device 1304, a video adapter 1310, and an I/O interface 1312, with some or all of the components being connected to a bus 1320. The memory 1308 stores instructions 1330, which are executable by one or more processors, such as the CPU 1314.
The bus 1320 may be one or more of any type of several bus architectures including a memory bus or memory controller, a peripheral bus, or a video bus. The CPU 1314 may comprise any type of electronic data processor. The memory 1308 may comprise any type of non-transitory system memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), or a combination thereof. In an embodiment, the memory 1308 may include ROM for use at boot-up, and DRAM for program and data storage for use while executing programs.
The mass storage 1304 may comprise any type of non-transitory storage device configured to store data, programs, and other information and to make the data, programs, and other information accessible via the bus 1320. The mass storage 1304 may comprise, for example, one or more of a solid state drive, hard disk drive, a magnetic disk drive, or an optical disk drive.
The video adapter 1310 and the I/O interface 1312 provide interfaces to couple external input and output devices to the processing unit 1302. As illustrated, examples of input and output devices include a display 1318 coupled to the video adapter 1310 and a mouse, keyboard, or printer 1316 coupled to the I/O interface 1312. Other devices may be coupled to the processing unit 1302, and additional or fewer interface cards may be utilized. For example, a serial interface such as Universal Serial Bus (USB) (not shown) may be used to provide an interface for an external device.
The processing unit 1302 also includes one or more network interfaces 1306, which may comprise wired links, such as an Ethernet cable, or wireless links to access nodes or different networks. The network interfaces 1306 allow the processing unit 1302 to communicate with remote units via the networks. For example, the network interfaces 1306 may provide wireless communication via one or more transmitters/transmit antennas and one or more receivers/receive antennas. In an embodiment, the processing unit 1302 is coupled to a local-area network (LAN) 1322 or a wide-area network (WAN) for data processing and communications with remote devices, such as other processing units, the Internet, or remote storage facilities.

It should be appreciated that one or more steps of the embodiment methods provided herein may be performed by corresponding circuits, units, or modules. For example, a signal may be transmitted by a transmitting circuit, unit, or module. A signal may be received by a receiving circuit, unit, or module. A signal may be processed by a processing circuit, unit, or module. The respective circuits, units, or modules may be hardware, software, or a combination thereof. For instance, one or more of the circuits, units, or modules may be an integrated circuit, such as field programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs).

In an embodiment, the computing system 1300 comprises a MEC node and the processing unit 1302 executes the instructions 1330 to receive mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area, and to receive traffic data and weather data of the geographical area. The processing unit 1302 executes the instructions 1330 to generate real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data.
The processing unit 1302 executes the instructions 1330 to generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic and weather models. The processing unit 1302 executes the instructions 1330 to share, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
In an embodiment, the MEC node includes a reception module that receives mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area, and receives traffic data and weather data of the geographical area; a generation module that generates real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; a map module that generates a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic, and weather models; and a share module that shares, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV. In some embodiments, the MEC node may include other or additional modules for performing any one of or combination of the steps described in the embodiments. Further, any of the additional or alternative embodiments or aspects of the method, as shown in any of the figures or recited in any of the claims, are also contemplated to include similar modules.
Although the present disclosure and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope of the disclosure as defined by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A method implemented by a first multi-access edge computing (MEC) node, the method comprising: receiving mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receiving traffic data and weather data of the geographical area; generating real-time road, traffic and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generating a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographic area in accordance with the real-time road, traffic, and weather models; and sharing, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
2. The method of claim 1, the mobile sensor data being received from the at least one AV.
3. The method of claim 1, the mobile sensor data being received from a second AV within the geographic area.
4. The method of claim 1, the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
5. The method of any of claims 1-4, the traffic data and the weather data being received from at least one of sensors or information services.
6. The method of any of claims 1-5, further comprising sharing, with the at least one AV, the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
7. The method of claim 6, the sharing the real-time road, traffic, and weather models comprising sharing a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
8. The method of any of claims 1-7, further comprising sharing, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
9. The method of claim 8, further comprising sharing, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
10. The method of any of claims 1-9, the sharing the 3D map comprising sharing a portion of the 3D map in accordance with the geolocation information associated with the at least one AV.
11. The method of any of claims 1-10, further comprising: receiving an update of at least one of: the fixed sensor data, the mobile sensor data, the traffic data, or the weather data; updating the 3D map in accordance with the update; and sharing, with the at least one AV, the updated 3D map in accordance with the geolocation information associated with the at least one AV.
12. The method of any of claims 1-11, further comprising determining a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors, and based thereon, replacing sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
13. The method of any of claims 1-12, further comprising receiving a request from a first AV connected with the MEC node, requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.
14. The method of any of claims 1-13, further comprising determining a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
15. A multi-access edge computing (MEC) node comprising: a non-transitory memory storing instructions; and at least one processor in communication with the memory, the at least one processor configured, upon execution of the instructions, to: receive mobile sensor data of one or more mobile sensors in a geographical area and fixed sensor data of one or more fixed sensors in the geographical area; receive traffic data and weather data of the geographical area; generate real-time road, traffic, and weather models of the geographical area in accordance with the mobile sensor data, the fixed sensor data, the traffic data, and the weather data; generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic and weather models; and share, with at least one autonomous vehicle (AV), at least a portion of the 3D map in accordance with geolocation information associated with the at least one AV.
16. The MEC node of claim 15, the mobile sensor data being received from the at least one AV.
17. The MEC node of claim 15, the mobile sensor data being received from a second AV within the geographic area.
18. The MEC node of claim 15, the mobile sensor data being received from one or both of the at least one AV and a second AV within the geographic area.
19. The MEC node of any one of claims 15-18, the traffic data and the weather data being received from at least one of sensors or information services.
20. The MEC node of any one of claims 15-19, the instructions causing the MEC node to share, with the at least one AV, the real-time road, traffic and weather models in accordance with the geolocation information associated with the at least one AV.
21. The MEC node of claim 20, the instructions causing the MEC node to share a portion of the real-time road, traffic, and weather models in accordance with the geolocation information associated with the at least one AV.
22. The MEC node of any of claims 15-21, the instructions further causing the MEC node to share, with a second MEC node, the 3D map in accordance with geolocation information associated with the second MEC node.
23. The MEC node of claim 22, the instructions further causing the MEC node to share, with the second MEC node, one or more of: the real-time road, traffic, and weather models, in accordance with the geolocation information associated with the second MEC node.
24. The MEC node of any one of claims 15-23, the 3D map comprising information of the at least one AV.
25. The MEC node of any one of claims 15-24, the instructions causing the MEC node to: receive an update of at least one of: the fixed sensor data, the mobile sensor data, the traffic data, or the weather data; update the 3D map in accordance with the update; and share, with the at least one AV, the updated 3D map in accordance with the geolocation information associated with the at least one AV.
26. The MEC node of any one of claims 15-25, the instructions causing the MEC node to: determine a presence of a faulty mobile sensor in the one or more mobile sensors or a faulty fixed sensor in the one or more fixed sensors; and based thereon, replace sensor data associated with the faulty mobile sensor or the faulty fixed sensor with sensor data associated with a sensor within a specified distance from the faulty mobile sensor or the faulty fixed sensor.
27. The MEC node of any one of claims 15-26, the instructions further causing the MEC node to receive a request from a first AV connected with the MEC node, the request requesting the MEC node provide sensor data in place of sensor data of a faulty sensor of the first AV.
28. The MEC node of any one of claims 15-27, the instructions further causing the MEC node to determine a fixed sensor or a mobile sensor whose sensor data is to be used in place of the sensor data of the faulty sensor of the first AV.
29. A first multi-access edge computing (MEC) node comprising: a messaging agent configured to communicate messages with a second MEC node, configured to communicate messages with at least one autonomous vehicle (AV) operating within coverage of the first MEC node, configured to receive sensor data from sensors in a geographical area, and configured to receive data from information services; a modeling service operatively coupled to the messaging agent, the modeling service configured to generate real-time road, traffic, and weather models of the geographical area in accordance with the sensor data and the data received from the information services; and a fusion service operatively coupled to the modeling service, the fusion service configured to generate a three-dimensional (3D) map representing road, traffic, and weather conditions of the geographical area in accordance with the real-time road, traffic, and weather models.
30. The first MEC node of claim 29, the fusion service configured to validate sensor data received from mobile sensors and fixed sensors in the geographical area, track AVs operating within the coverage of the first MEC node, and maintain a connection state with the at least one AV.
31. The first MEC node of any one of claims 29-30, the messaging agent configured to receive updated sensor data or updated data from the information services.
32. The first MEC node of claim 31, the modeling service configured to update the real-time road, traffic, or weather models in accordance with the updated sensor data or the updated data from the information services.
33. The first MEC node of claim 32, the fusion service configured to update the 3D map in accordance with the updated real-time road, traffic, or weather models.
34. The first MEC node of claim 33, the messaging agent configured to share the updated 3D map in accordance with geolocation information associated with the at least one AV.
35. The first MEC node of claim 33, the messaging agent configured to share, with the second MEC node, the updated 3D map in accordance with geolocation information associated with the second MEC node.
36. The first MEC node of claim 32, the messaging agent configured to share, with the second MEC node, the updated real-time road, traffic, or weather models in accordance with geolocation information associated with the second MEC node.
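Claims 35 and 36 gate inter-node sharing on the second MEC node's geolocation. A minimal sketch of that gate, assuming rectangular coverage areas (the function name, tuple layout, and overlap test are all illustrative assumptions):

```python
def share_with_neighbor(updated_models, neighbor_area, model_extent):
    """Claims 35-36 sketch: forward updated models (or an updated 3D map)
    only when their extent overlaps the second MEC node's coverage area.

    Areas are axis-aligned rectangles (min_lat, min_lon, max_lat, max_lon).
    Returns the payload to push via the messaging agent, or None."""
    def overlaps(a, b):
        return (a[0] <= b[2] and b[0] <= a[2] and
                a[1] <= b[3] and b[1] <= a[3])
    if overlaps(model_extent, neighbor_area):
        return updated_models  # would be handed to the messaging agent
    return None
```

As with AV-facing sharing, the filter keeps each node from flooding its neighbors with updates for regions they do not cover.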
PCT/US2022/017053 2022-02-18 2022-02-18 Methods and apparatus for supporting autonomous vehicles in multi-edge computing systems WO2022082230A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/017053 WO2022082230A2 (en) 2022-02-18 2022-02-18 Methods and apparatus for supporting autonomous vehicles in multi-edge computing systems

Publications (3)

Publication Number Publication Date
WO2022082230A2 true WO2022082230A2 (en) 2022-04-21
WO2022082230A9 WO2022082230A9 (en) 2022-06-16
WO2022082230A3 WO2022082230A3 (en) 2022-11-17

Family

ID=80928913

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/017053 WO2022082230A2 (en) 2022-02-18 2022-02-18 Methods and apparatus for supporting autonomous vehicles in multi-edge computing systems

Country Status (1)

Country Link
WO (1) WO2022082230A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230322241A1 (en) * 2022-04-06 2023-10-12 Ghost Autonomy Inc. Implementing degraded performance modes in an autonomous vehicle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2724353T3 (en) * 2014-04-04 2019-09-10 Signify Holding Bv System and methods for the support of autonomous vehicles through environmental perception and sensor calibration and verification
WO2020145441A1 (en) * 2019-01-11 2020-07-16 엘지전자 주식회사 Electronic device for vehicle and method for operating electronic device for vehicle
US10992752B2 (en) * 2019-03-28 2021-04-27 Intel Corporation Sensor network configuration mechanisms
EP3913597A1 (en) * 2020-05-21 2021-11-24 Deutsche Telekom AG Vehicle operation with a central digital traffic model
EP4229508A2 (en) * 2020-11-23 2023-08-23 Huawei Technologies Co., Ltd. Methods and apparatus for supporting application mobility in multi-access edge computing platform architectures


Also Published As

Publication number Publication date
WO2022082230A9 (en) 2022-06-16
WO2022082230A3 (en) 2022-11-17

Similar Documents

Publication Publication Date Title
Raza et al. A survey on vehicular edge computing: architecture, applications, technical issues, and future directions
CN108447291B (en) Intelligent road facility system and control method
US11815617B2 (en) Generation and use of HD maps
US11520331B2 (en) Methods and apparatus to update autonomous vehicle perspectives
US20210163021A1 (en) Redundancy in autonomous vehicles
US10518770B2 (en) Hierarchical motion planning for autonomous vehicles
WO2019052327A1 (en) Transportation information processing method and related device
JP2021501425A (en) Cellular network-based driving support method and traffic control unit
US10839682B1 (en) Method and system for traffic behavior detection and warnings
US11895566B2 (en) Methods of operating a wireless data bus in vehicle platoons
JPWO2019077999A1 (en) Image pickup device, image processing device, and image processing method
US20240005779A1 (en) Autonomous vehicle cloud system
WO2019042592A1 (en) A vehicle control method, devices and system
US20210204188A1 (en) Mobility information provision system for mobile bodies, server, and vehicle
US20230415762A1 (en) Peer-to-peer occupancy estimation
US11080999B2 (en) Traffic application instance processing method and traffic control unit
US20210280064A1 (en) System and method for location data fusion and filtering
WO2022082230A2 (en) Methods and apparatus for supporting autonomous vehicles in multi-edge computing systems
US20220032934A1 (en) Method, apparatus, device and system for controlling driving
CN113811930A (en) Information processing apparatus, information processing method, and program
EP3972227A1 (en) Method and apparatus for driving control, device, medium and system
US20220148430A1 (en) Sharing traveled pathway data
WO2021253374A1 (en) V2X Message For Platooning
JP2020154631A (en) Remote controller and automatic driving system
US20230339480A1 (en) Methods and apparatuses for transient fault detection