US20210049903A1 - Method and apparatus for perception-sharing between vehicles - Google Patents
Method and apparatus for perception-sharing between vehicles
- Publication number
- US20210049903A1 (application US16/751,804)
- Authority
- US
- United States
- Prior art keywords
- connected vehicle
- vehicle
- objects
- spatial
- monitoring system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0287—Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
- G05D1/0291—Fleet control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/40—Business processes related to the transportation industry
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096783—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using location based information parameters using movement velocity, acceleration information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/06—Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
- H04W4/08—User group management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/20—Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
Definitions
- the application-layer routine includes collecting real-time data associated with a plurality of objects from each of the plurality of similarly-situated vehicles traveling on the portion of the roadway, predicting motion of each of the plurality of objects based upon the real-time data from the plurality of similarly-situated vehicles, object-matching the motion of each of the plurality of objects, wherein the object-matching is subjected to a time constraint, and executing fusion of the plurality of objects based upon the object-matching of the motion of each of the plurality of objects.
- Locations of the similarly-situated vehicles traveling on the portion of the roadway are identified based upon the fusion of the plurality of objects.
- the locations of the similarly-situated vehicles traveling on the portion of the roadway are communicated to one of the similarly-situated vehicles.
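The collect–predict–match–fuse sequence described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the constant-velocity motion model, the one-dimensional position matching with a distance gate, the averaging fusion rule, and all identifiers are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One object report from a connected vehicle (hypothetical fields)."""
    obj_id: str       # per-vehicle local object identifier
    position: float   # longitudinal position along the roadway segment, m
    speed: float      # estimated object speed, m/s
    timestamp: float  # capture time, s

def predict_position(obs: Observation, t: float) -> float:
    """Constant-velocity motion prediction to a common reference time t."""
    return obs.position + obs.speed * (t - obs.timestamp)

def fuse_reports(reports, ref_time, max_age=0.5, gate=2.0):
    """Match and fuse object reports from several vehicles.

    Reports older than max_age seconds are discarded (the time constraint
    on object-matching); predicted positions within `gate` metres of a
    cluster mean are treated as the same object and fused by averaging
    (a deliberately simple stand-in for a full tracker).
    """
    fresh = [o for o in reports if ref_time - o.timestamp <= max_age]
    fused = []  # list of (position_sum, speed_sum, count)
    for o in sorted(fresh, key=lambda o: predict_position(o, ref_time)):
        p = predict_position(o, ref_time)
        if fused:
            pos_sum, spd_sum, n = fused[-1]
            if abs(p - pos_sum / n) <= gate:
                fused[-1] = (pos_sum + p, spd_sum + o.speed, n + 1)
                continue
        fused.append((p, o.speed, 1))
    return [(ps / n, ss / n) for ps, ss, n in fused]
```

Here `max_age` plays the role of the time constraint recited above: stale reports are excluded before matching, so the fused picture only reflects recent perception results.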
- Another aspect of the disclosure includes collecting real-time data related to distance, a visual descriptor, a lane-level lateral position, and a speed estimation of each of the plurality of objects, a corresponding time stamp, and a geo-spatial positioning of the respective one of the plurality of similarly-situated vehicles.
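The per-object record enumerated above can be represented, for illustration, by a simple data structure. The field names and the dict-based serialization are hypothetical, chosen only to show how one vehicle's report might be packaged for transmission.

```python
from dataclasses import dataclass, asdict

@dataclass
class ObjectReport:
    distance_m: float         # range from the reporting vehicle to the object
    visual_descriptor: bytes  # compact appearance feature used for matching
    lane_offset_m: float      # lane-level lateral position within the lane
    speed_mps: float          # speed estimation of the object
    timestamp_s: float        # corresponding time stamp

@dataclass
class VehicleMessage:
    vehicle_id: str
    latitude: float           # geo-spatial positioning of the reporting vehicle
    longitude: float
    objects: list             # list of ObjectReport

    def payload(self) -> dict:
        """Serialize to a plain dict suitable for a V2X transport layer."""
        return asdict(self)   # asdict recurses into nested dataclasses
```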
- Another aspect of the disclosure includes executing blind area monitoring by at least one of the plurality of similarly-situated vehicles based upon the fusion of the plurality of objects.
- Another aspect of the disclosure includes the similarly-situated vehicles being a plurality of vehicles that are travelling in the same direction on the portion of the roadway.
- the software routine includes an instruction set that is executable to collect real-time data associated with a plurality of objects from each of a plurality of similarly-situated vehicles traveling on the portion of the roadway, predict motion of each of the plurality of objects based upon the real-time data from the plurality of similarly-situated vehicles, object-match the motion of each of the plurality of objects, wherein the object-matching is subjected to a time constraint, and fuse the plurality of objects based upon the object-matching of the motion of each of the plurality of objects.
- Locations of the plurality of similarly-situated vehicles traveling on the portion of the roadway are identified based upon the fused plurality of objects and communicated, via the roadside unit, to one of the similarly-situated vehicles.
- Another aspect of the disclosure includes the application-layer software routine being arranged to be executed by a multi-edge computing cluster.
- Another aspect of the disclosure includes communicating, via the roadside unit, the locations of the similarly-situated vehicles traveling on the portion of the roadway to one of the similarly-situated vehicles, wherein the one of the similarly-situated vehicles includes an advanced driver-assistance system (ADAS).
- Another aspect of the disclosure includes controlling the one of the similarly-situated vehicles that includes an advanced driver-assistance system based upon the locations of the similarly-situated vehicles traveling on the portion of the roadway.
- FIG. 1 schematically illustrates a plurality of vehicles travelling on a multi-lane highway that is equipped with an embodiment of an intelligent vehicle/highway system (IVHS), in accordance with the disclosure.
- FIG. 2 schematically illustrates, in flowchart form, a Multi-Access Edge Computing (MEC)-based perception-sharing routine for cooperative perception of a driving environment, in accordance with the disclosure.
- FIG. 3 schematically illustrates a timeline that indicates time-dependent tasks that are performed as part of an object matching routine, in accordance with the disclosure.
- FIG. 1 schematically illustrates a plurality of vehicles 30 that are travelling on a portion of a multi-lane highway 50 that is equipped with an embodiment of an intelligent vehicle/highway system (IVHS) 100 that includes a Multi-Access Edge Computing (MEC) cluster 10 and a plurality of road-side units (RSUs) 20 .
- the MEC cluster 10 includes a Multi-Access Edge Computing (MEC) perception-sharing routine (routine) 200 , as described with reference to FIGS. 2 and 3 .
- Each of the vehicles 30 may include, but is not limited to, a mobile platform in the form of a commercial vehicle, industrial vehicle, agricultural vehicle, passenger vehicle, aircraft, watercraft, train, all-terrain vehicle, personal movement apparatus, robot, and the like to accomplish the purposes of this disclosure.
- a subset of the vehicles 30 may be connected vehicles 40 .
- Connected vehicles 40 are equipped with a spatial monitoring system 44 and a telematics communication system 42 that is capable of wireless extra-vehicle communications.
- One or more of the connected vehicles 40 may include an advanced driver-assistance system (ADAS) 46 .
- the telematics communication system 42 is capable of extra-vehicle communications, including communicating with a communication network system that may include wireless and wired communication capabilities.
- the telematics communication system 42 includes a telematics controller that is capable of extra-vehicle communications that includes vehicle-to-everything (V2X) communication.
- V2X communication includes short-range vehicle-to-vehicle (V2V) communication, and communication to one or more of the RSUs 20 , thus facilitating localized communication between a plurality of similarly-situated vehicles that are moving parts of the IVHS 100 .
- the telematics communication system 42 is capable of short-range wireless communication to a handheld device, e.g., a cell phone, a satellite phone or another telephonic device.
- the handheld device is loaded with a software application that includes a wireless protocol to communicate with the telematics controller, and the handheld device executes the extra-vehicle communication, including communicating with an off-board controller via a communication network.
- the telematics controller executes the extra-vehicle communication directly by communicating with the off-board controller via the communication network.
- the spatial monitoring system 44 includes a plurality of spatial sensors that are in communication with a spatial monitoring controller, wherein each of the spatial sensors is disposed on-vehicle to monitor a field-of-view surrounding the connected vehicle 40 , including other vehicles 30 that are proximal to the connected vehicle 40 .
- the spatial monitoring controller generates digital representations of each of the fields of view including the proximal vehicles 30 based upon data inputs from the spatial sensors.
- the spatial monitoring controller can evaluate inputs from the spatial sensors to determine a linear range, relative speed, and trajectory of the connected vehicle 40 in view of each of the proximal vehicles 30 .
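As one illustration of how a spatial monitoring controller might derive relative speed from successive range measurements, a least-squares line fit over a short range history yields the range rate. This is a sketch under assumed names; the actual estimator used by the disclosure is not specified.

```python
def assess_target(ranges, times):
    """Estimate linear range, relative speed, and trend of a tracked vehicle
    from a short history of range measurements via a least-squares line fit."""
    n = len(ranges)
    t_mean = sum(times) / n
    r_mean = sum(ranges) / n
    num = sum((t - t_mean) * (r - r_mean) for t, r in zip(times, ranges))
    den = sum((t - t_mean) ** 2 for t in times)
    rel_speed = num / den          # m/s; negative means the gap is closing
    current_range = ranges[-1]     # most recent linear range, m
    trend = "closing" if rel_speed < 0 else "opening"
    return current_range, rel_speed, trend
```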
- the spatial sensors can be located at various locations on the connected vehicle 40 , including the front corners, rear corners, rear sides and mid-sides.
- the spatial sensors can include a front radar sensor and a camera in one embodiment, although the disclosure is not so limited. Placement of the aforementioned spatial sensors permits the spatial monitoring controller to monitor traffic flow including proximate vehicles and other objects in the vicinity of the connected vehicle 40 .
- the spatial sensors of the spatial monitoring system 44 can further include object-locating sensing devices including range sensors, such as FM-CW (Frequency Modulated Continuous Wave) radars, pulse and FSK (Frequency Shift Keying) radars, and Lidar (Light Detection and Ranging) devices, and ultrasonic devices which rely upon effects such as Doppler-effect measurements to locate forward objects.
- the possible object-locating devices include charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) video image sensors, and other camera/video image processors which utilize digital photographic methods to ‘view’ forward objects including one or more vehicle(s).
- Each of the connected vehicles 40 includes a propulsion system, a wheel braking system, and a steering system.
- the operation of the propulsion system, the wheel braking system, and the steering system may be controlled by direct interaction with a vehicle operator alone, or in combination with the ADAS 46 , employing inputs from the spatial monitoring system 44 .
- the ADAS 46 is arranged to provide operator assistance features by controlling one or more of the propulsion system, the wheel braking system, and the steering system with little or no direct interaction of the vehicle operator.
- the ADAS 46 includes a controller and one or a plurality of subsystems that provide operator assistance features, including one or more of an adaptive cruise control (ACC) system, a lane-keeping control (LKY) system, a lane change control (LCC) system, an autonomous braking/collision avoidance system, and/or other systems that are configured to command and control autonomous vehicle operation separate from or in conjunction with operator requests.
- the ADAS 46 may interact with and access information from an on-board map database for route planning and to control operation of the connected vehicle 40 via the lane-keeping system, the lane-centering system, and/or other systems that are configured to command and control autonomous vehicle operation.
- Autonomous operating commands may be generated to control the ACC system, the LKY system, the LCC system, the autonomous braking/collision avoidance system, and/or the other systems.
- Vehicle operation includes operation in a propulsion mode in response to desired commands, which can include operator requests and/or autonomous vehicle requests.
- Vehicle operation, including autonomous vehicle operation includes acceleration, braking, steering, steady-state running, coasting, and idling. Operator requests can be generated based upon operator inputs to an accelerator pedal, a brake pedal, a steering wheel, a transmission range selector, the ACC system, etc.
- each of the connected vehicles 40 traveling on the portion of the multi-lane highway 50 may be capable of detecting one or more of the other vehicles 30 that are proximal thereto employing inputs from the spatial monitoring system 44 .
- some of the vehicles 30 proximal thereto may be undetectable due to masking caused by other, intervening vehicles 30 .
- Masking by intervening vehicles 30 creates blind areas that cannot be perceived by one or more of the connected vehicles 40.
- the IVHS 100 includes the MEC cluster 10 , which may be remotely-located and is in communication with the plurality of road-side units (RSUs) 20 , and can be configured to monitor locations, speeds and trajectories of a plurality of vehicles 30 , including a plurality of similarly-situated ones of the vehicles 30 that are travelling on the portion of the multi-lane highway 50 .
- Similarly-situated vehicles are those vehicles 30 that are travelling in the same direction on the same portion of the multi-lane highway 50 .
- the same portion of the multi-lane highway 50 includes a portion of the multi-lane highway 50 that is within communication range of one of the RSUs 20 .
- the MEC cluster 10 includes a cloud-based IT (information technology) service environment located at the edge of a network.
- the purpose of edge computing and the MEC cluster 10 is to provide real-time, high-bandwidth, low-latency access for latency-dependent applications that are distributed at the edge of the network. Because edge computing is closer to the end user and to applications, it supports both localized and cloud-based applications. Edge computing reduces network congestion and improves application performance by executing related task processing closer to the end user, i.e., the connected vehicle 40, improving the delivery of content and applications to those users.
- the MEC cluster 10 moves the computing of traffic and services from a centralized cloud to the edge of the network and closer to the connected vehicle 40 , and the network edge analyzes, processes, and stores the data.
- Characteristics of the MEC cluster 10 and road-side units (RSUs) 20 include proximity, ultra-low latency, high bandwidth, and virtualization.
- the connected vehicle 40 When deployed on-vehicle, the connected vehicle 40 is able to constantly sense driving patterns, road conditions and other vehicle movements to provide guidance to the vehicle operator and the ADAS 46 .
- Most of the predictive and prescriptive insights need to be provided in a timely manner, which means that data from the spatial monitoring system 44 needs to be collected, processed and analyzed by the MEC cluster 10 to provide low latency insights to the vehicle operator and the ADAS 46 .
- the MEC cluster 10 includes an application-layer software architecture and algorithm design that enables efficient processing for cooperative perception of the driving environment on the portion of the multi-lane highway 50 , with the original perception results provided by individual ones of the connected vehicles 40 , and the fused perception results provided by the RSUs 20 following the fusion tasks performed by MEC cluster 10 .
- the core tasks include identifying overlap between perception results from a plurality of similarly-situated vehicles, mitigating processing latency, and providing data usable by a wide range of applications to a plurality of functionality modules. Applications may be related to traffic monitoring, detection of congestion, detection of emergency vehicles and roadside assistance vehicles, access and parking system operation, enforcement systems, multi-lane and single-lane free flow systems, etc.
- the MEC cluster 10 provides a systematic solution for perception-sharing among connected vehicles and infrastructure through multi-access edge computing (MEC), including an architecture and algorithm design of software modules for fusion tasks to be performed efficiently.
- the MEC cluster 10 can be used to support a wide range of applications carried by individual vehicles or infrastructure sites, with negligible errors in real-time. This enables blind area monitoring, detection, and mitigation for individual vehicles, and other applications based on sensory information obtained over V2X, with an effective approach for latency mitigation and for ensuring robustness in densely populated real-time traffic situations.
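Blind-area monitoring can be illustrated, at its simplest, as a set difference between the RSU-fused object list and a vehicle's own perception. This is a sketch: a real system would match objects by position and visual descriptor rather than by shared identifiers.

```python
def blind_area_objects(fused_ids, own_ids):
    """Objects present in the RSU-fused picture but absent from the vehicle's
    own perception: candidates located in the vehicle's blind areas."""
    return sorted(set(fused_ids) - set(own_ids))
```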
- the road-side units (RSUs) 20 are transceivers configured for Dedicated Short-Range Communications (DSRC) that may be mounted along a road or a pedestrian passageway. RSUs 20 communicate using short-range, low-power data transmissions of limited duration. The main function of each of the RSUs 20 is to facilitate the communication between vehicles, transportation infrastructure, and other devices by transferring data over DSRC in accordance with industry standards, e.g., SAE Standard J2735 (SAE J2735—Dedicated Short Range Communications (DSRC) Message Set Dictionary).
- the RSUs 20 are integrated into and communicate with the MEC cluster 10 . Each of the RSUs 20 may be in communication with a traffic monitoring fixture (not shown), such as a roadside camera or another device.
- Each RSU 20 broadcasts data to or exchanges data with connected vehicles 40 that are disposed within its communication zone and provides channel assignments and operating instructions to them. The connected vehicles 40 receive these broadcasts.
- the concepts described herein facilitate connected intelligent driving (CID) in the framework of V2X perception-sharing, together with payload functionality modules that benefit from common APIs, and V2X perception-sharing algorithms driven by the principle of efficient utilization of computing resources, with consideration for scalability with real-time traffic flow as well as future wireless bandwidth expansion.
- the software architecture is tailored to parallelize multiple tasks that potentially conflict, while the conflicts are effectively avoided by making use of an event-triggered conflict look-up mechanism.
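One plausible reading of an event-triggered conflict look-up is a shared table of busy object IDs: a fusion task starts only if none of the objects it will touch are held by a running task, and otherwise retries on the next event. The sketch below is an assumption about how such a mechanism could work, not the disclosed design.

```python
import threading

class ConflictTable:
    """Event-triggered conflict look-up (a sketch): parallel tasks declare
    the object IDs they will touch; a task is admitted only when none of
    its IDs are held by a running task, so conflicting tasks never overlap."""

    def __init__(self):
        self._lock = threading.Lock()
        self._busy = set()

    def try_acquire(self, ids):
        """Admit the task if conflict-free; otherwise the caller retries
        when the next release event fires."""
        with self._lock:
            if self._busy & set(ids):
                return False
            self._busy |= set(ids)
            return True

    def release(self, ids):
        """Free the IDs when the task completes (the triggering event)."""
        with self._lock:
            self._busy -= set(ids)
```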
- controller and related terms such as microcontroller, control module, module, control, control unit, processor and similar terms refer to one or various combinations of Application Specific Integrated Circuit(s) (ASIC), Field-Programmable Gate Array (FPGA), electronic circuit(s), central processing unit(s), e.g., microprocessor(s) and associated non-transitory memory component(s) in the form of memory and storage devices (read only, programmable read only, random access, hard drive, etc.).
- the non-transitory memory component is capable of storing machine readable instructions in the form of one or more software or firmware programs or routines, combinational logic circuit(s), input/output circuit(s) and devices, signal conditioning and buffer circuitry and other components that can be accessed by one or more processors to provide a described functionality.
- Input/output circuit(s) and devices include analog/digital converters and related devices that monitor inputs from sensors, with such inputs monitored at a preset sampling frequency or in response to a triggering event.
- Software, firmware, programs, instructions, control routines, code, algorithms and similar terms mean controller-executable instruction sets including calibrations and look-up tables. Each controller executes control routine(s) to provide desired functions.
- Routines may be executed at regular intervals, for example each 100 microseconds during ongoing operation. Alternatively, routines may be executed in response to occurrence of a triggering event.
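The two execution models, periodic and event-triggered, can be combined in a single dispatcher, sketched below with hypothetical names:

```python
def run_routine(routine, now, state, period=100e-6, events=()):
    """Dispatch a control routine either on a fixed period (e.g. every
    100 microseconds of ongoing operation) or immediately when a
    triggering event is pending; returns True if the routine ran."""
    elapsed = now - state.get("last_run", -period)
    if elapsed >= period or events:
        state["last_run"] = now
        routine(events)
        return True
    return False
```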
- Communication between controllers, and communication between controllers, actuators and/or sensors may be accomplished using a direct wired point-to-point link, a networked communication bus link, a wireless link or another suitable communication link.
- Communication includes exchanging data signals in suitable form, including, for example, electrical signals via a conductive medium, electromagnetic signals via air, optical signals via optical waveguides, and the like.
- the data signals may include discrete, analog or digitized analog signals representing inputs from sensors, actuator commands, and communication between controllers.
- signal refers to a physically discernible indicator that conveys information, and may be a suitable waveform (e.g., electrical, optical, magnetic, mechanical or electromagnetic), such as DC, AC, sinusoidal-wave, triangular-wave, square-wave, vibration, and the like, that is capable of traveling through a medium.
- a parameter is defined as a measurable quantity that represents a physical property of a device or other element that is discernible using one or more sensors and/or a physical model.
- a parameter can have a discrete value, e.g., either “1” or “0”, or can be infinitely variable in value.
- FIG. 2 schematically shows an embodiment of the MEC routine 200 , including a main thread of the application-layer software for MEC-based cooperative perception of a driving environment, wherein an example of the driving environment is analogous to the portion of the multi-lane highway 50 and IVHS 100 that are described with reference to FIG. 1 .
- the MEC routine 200 is primarily executed in the MEC cluster 10 that is described with reference to FIG. 1 .
- the teachings may be described herein in terms of functional and/or logical block components and/or various processing steps.
- the MEC routine 200 is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, and/or firmware components that have been configured to perform the specified functions.
- the blocks represent computer instructions that, when executed by one or more processors, perform the recited operations.
- the steps of the MEC routine 200 may be executed in a suitable order, and are not limited to the order described with reference to FIG. 2 .
- the MEC routine 200 includes capturing a plurality of vehicle perception packets (VPP) ( 201 ), which are provided by one or more proximal connected vehicles that are operating in the driving environment and sent through an interface.
- Each of the VPPs encapsulates the data as needed by the core algorithms.
- Each VPP encapsulates the following information, by way of non-limiting example: depth (distance); visual descriptors, e.g., an RGB or HSV color histogram, SURF, or another image feature vector; lane-level lateral position; a speed estimation of each detected object; and geo-spatial positioning information of the observing vehicle together with a corresponding time stamp.
- the geo-spatial positioning may be provided by a satellite navigation system that provides autonomous geo-spatial positioning with global coverage, including GNSS (Global Navigation Satellite System), Global Positioning System (GPS), and other regional systems.
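The VPP contents listed above can be sketched as a simple data structure; the field and class names below are illustrative assumptions rather than the packet format defined by the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectedObject:
    depth_m: float            # distance to the detected object
    descriptor: List[float]   # visual descriptor, e.g., an RGB/HSV histogram or SURF vector
    lane_lateral_pos: float   # lane-level lateral position
    speed_mps: float          # speed estimate of the detected object

@dataclass
class VPP:
    """Vehicle Perception Packet: one observing vehicle's view of its surroundings."""
    observer_lat: float       # geo-spatial position of the observing vehicle (e.g., GNSS)
    observer_lon: float
    timestamp_s: float        # time stamp corresponding to the observation
    objects: List[DetectedObject] = field(default_factory=list)

# A minimal packet from one connected vehicle reporting a single detected object:
vpp = VPP(observer_lat=42.33, observer_lon=-83.04, timestamp_s=1000.0,
          objects=[DetectedObject(depth_m=25.0, descriptor=[0.2, 0.5, 0.3],
                                  lane_lateral_pos=1.8, speed_mps=27.0)])
```

Lower layers would serialize such a structure for transmission over DSRC or C-V2X.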
- the VPPs, which have been parsed, are pushed into a first buffer, referred to as a RecepBuffer ( 202 ), which is temporary storage for the VPPs waiting to be processed.
- the VPPs are parsed by lower layers based on the signal received through the air interface of DSRC or C-V2X or some other wireless protocol for device-to-device communication.
- the RecepBuffer is used without being tied to a scheduled FPP transmission, and thus does not need to be periodically allocated and released.
- the RecepBuffer is monitored ( 204 ), which includes periodically querying the first buffer to find presence of a VPP, which is then taken out and conveyed to a VPP pre-processing module ( 208 ) together with the current system time ( 206 ).
- the VPP pre-processing module ( 208 ) includes assigning a sub-thread of motion prediction for each of the VPPs that is conveyed by the previous module, with a target FPP transmission cycle (either current or next) being a sub-thread attribute.
- the FPP (“Fused Perception Packet”) is a data packet that is a fusion of the VPPs from the plurality of similarly-situated vehicles that are in communication with the RSU 20 .
- the FPP encapsulates positioning data of each of the plurality of similarly-situated vehicles in communication with the RSU 20 based on the fusion of the received VPPs corresponding to a specific temporal and spatial range, supplemented by the ID and positioning information of the particular RSU 20 .
- the FPP may include appropriate information for facilitating each of the connected vehicles 40 to identify which object described in the FPP refers to itself, i.e., to perform self-identification.
- the self-identification is based on an element of a Basic Safety Message (BSM) received by the RSU from each vehicle, namely the “DE TemporaryID” field included in the BSM; alternatively, a hash value of the time-frequency resources used by each vehicle for its last BSM or VPP transmission can be utilized for this purpose.
- the BSM is a message entity standardized by SAE (Society of Automotive Engineers) Standard J2735, which is intended to be broadcast by individual vehicles through an air interface.
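Self-identification can be sketched as a lookup keyed on the identifier the RSU attaches to each FPP object entry, taken from the vehicle's BSM “DE TemporaryID” (or, alternatively, a hash of the time-frequency resources of its last transmission); the dictionary layout and function name here are assumptions for illustration:

```python
def find_self(fpp_objects, my_temporary_id):
    """Return the FPP object entry that refers to the receiving vehicle itself,
    matched via the TemporaryID the RSU copied from that vehicle's own BSM."""
    for obj in fpp_objects:
        if obj["id"] == my_temporary_id:
            return obj
    return None  # this vehicle is not represented in the FPP

fpp_objects = [
    {"id": "a1b2c3d4", "lat": 42.331, "lon": -83.041},
    {"id": "99ee00ff", "lat": 42.332, "lon": -83.043},
]
me = find_self(fpp_objects, "99ee00ff")  # the receiving vehicle's own entry
```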
- a timing advance for motion prediction is determined as the target FPP transmission time plus t 3 minus the VPP time stamp, based on the system time obtained ( 210 ).
- a motion prediction step ( 212 ) includes motion prediction followed by writing into a second buffer, referred to as a WaitBuffer.
- a motion-predicted VPP (MpVPP) is produced through the motion prediction algorithm by linearly extrapolating the positioning data of each object in the received VPP to a future time instant, as determined by one of the RSUs 20 based on the reception time of the corresponding VPP.
- a position interval is determined, and is represented by the front-most and rear-most object positions.
- the MpVPP is transferred to the second buffer, i.e., WaitBuffer, which is associated with the sub-thread attribute. Furthermore, an event flag indicating that a new MpVPP is being transferred to the WaitBuffer is conveyed.
- the motion prediction step includes linearly extrapolating the positioning data of each object in the received VPP to a future time instant, as determined by one of the RSUs 20 based on the reception time of VPP, which yields the corresponding motion-predicted VPP (MpVPP).
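The extrapolation step can be sketched as follows, with the timing advance computed as the target FPP transmission time plus t3 minus the VPP time stamp; positions are simplified to scalar along-road coordinates, an illustrative assumption:

```python
def motion_predict(objects, vpp_timestamp, target_fpp_time, t3):
    """Linearly extrapolate each (position, speed) pair to the future instant
    at which the FPP will be applied on the receiving vehicles (T + t3)."""
    dt = (target_fpp_time + t3) - vpp_timestamp   # timing advance
    return [(pos + speed * dt, speed) for pos, speed in objects]

# Two objects observed at t = 10.0 s, predicted 0.5 s ahead:
mp = motion_predict([(100.0, 20.0), (140.0, 30.0)],
                    vpp_timestamp=10.0, target_fpp_time=10.25, t3=0.25)
# The position interval is represented by the rear-most and front-most positions:
interval = (min(p for p, _ in mp), max(p for p, _ in mp))
```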
- the MpVPP, along with the position interval, the sub-thread attribute, and the event flag, are conveyed to the WaitBuffer ( 214 ) and used to update an MpVPP conflict look-up table (MpVPPConfLUT) as it pertains to the specific sub-thread attribute (current or next), given the inputs from step 212 and/or step 234 ( 216 ).
- the MpVPP conflict look-up table indicates whether there is potential overlap of objects between two MpVPPs, for all pairs across the in-buffer MpVPPs and the in-matching-processing MpVPPs.
- the updated MpVPP conflict look-up table (MpVPPConfLUT) is captured ( 218 ) and subjected to conflict-free MpVPP subset extraction ( 220 ).
- the conflict-free MpVPP subset extraction ( 220 ) includes determining a conflict-free subset of in-buffer MpVPPs that is also in conflict-free status with the current in-matching-processing MpVPPs based on the MpVPPConfLUT, and removing the conflict-free subset of in-buffer MpVPPs from the WaitBuffer if it is non-empty.
- the conflict-free subset may be optimal or sub-optimal depending on the method used.
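One possible (sub-optimal, greedy) realization of the conflict look-up and conflict-free subset extraction is sketched below, treating two MpVPPs as potentially conflicting when their position intervals overlap; the interval representation follows the front-most/rear-most convention above:

```python
def intervals_conflict(a, b):
    """Two MpVPPs may describe overlapping objects when their position intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

def conflict_free_subset(in_buffer, in_processing):
    """Greedily select in-buffer MpVPPs that conflict neither with each other nor
    with any MpVPP currently in matching-processing (a sub-optimal method)."""
    chosen = []
    for interval in in_buffer:
        if all(not intervals_conflict(interval, other)
               for other in chosen + in_processing):
            chosen.append(interval)
    return chosen

# (0, 50) is selected; (40, 90) conflicts with it, and (100, 150)
# conflicts with the in-processing interval (140, 200):
subset = conflict_free_subset(in_buffer=[(0, 50), (40, 90), (100, 150)],
                              in_processing=[(140, 200)])
```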
- a sub-thread of object matching is assigned ( 222 ).
- A subset of the corresponding TransBuffer to which the MpVPP may be matched (TbMatchSet) is assigned according to the MpVPP's position interval as part of MpVPP pre-processing ( 224 ).
- the TbMatchSet and the sub-thread attribute are captured ( 226 ) and provided to an object matching routine ( 228 ), which is subject to a timing constraint.
- the object matching routine ( 228 ) identifies objects referring to the same vehicle across the MpVPPs from different ones of the connected vehicles 40 .
- the object matching routine ( 228 ) includes performing the object matching on multiple MpVPPs and TbMatchSets until the sub-thread serving the FPP transmission time instant T, i.e., the sub-thread with the attribute of “current”, is terminated at time instant T−t 2 .
- If the TbMatchSet is empty, the MpVPP is used as the fusion result directly.
- the object matching routine ( 228 ) adapts maximal bipartite matching, in that the edges produced by the maximal matching are further pruned according to an upper threshold on the total matching score.
- the total matching score is calculated as the weighted harmonic mean of the matching scores pertaining to distance and visual descriptors, respectively.
- the sub-task of edge score calculation together with pruning can be parallelized, with a sub-thread assigned for each vertex (object) on the side with fewer vertices; i.e., for the N-th vertex on the side with fewer vertices, the N edges with the lowest matching scores are retained upon completion of the calculation of its associated edge scores, and the vertices are then traversed serially for the optimal match to be determined.
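The scoring and pruning can be sketched as below, under the assumptions that lower scores indicate better matches and that per-packet object counts are small enough for brute-force assignment; the weights and threshold are illustrative, not values specified by the disclosure:

```python
from itertools import permutations

def total_score(dist_score, desc_score, w_dist=1.0, w_desc=1.0):
    """Weighted harmonic mean of the distance-based and visual-descriptor-based scores."""
    return (w_dist + w_desc) / (w_dist / dist_score + w_desc / desc_score)

def match_objects(scores, threshold):
    """scores[i][j] is the total matching score between object i (the side with
    fewer vertices) and object j. Find the assignment minimizing the summed score,
    then prune pairs whose score exceeds the upper threshold."""
    n, m = len(scores), len(scores[0])
    best = min(permutations(range(m), n),
               key=lambda p: sum(scores[i][p[i]] for i in range(n)))
    return [(i, j) for i, j in enumerate(best) if scores[i][j] <= threshold]

scores = [[0.2, 0.9],
          [0.8, 0.95]]
pairs = match_objects(scores, threshold=0.9)  # (1, 1) scores 0.95 and is pruned
```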
- the sub-threads for edge score calculation together with pruning can be populated in the form of thread pools, which are managed by the timer thread in the same way as other sub-threads, and have the number of pools and pool sizes selected empirically.
- the timer thread is responsible for the allocation and release of buffers, the change of sub-thread attributes, and also the termination of the sub-threads serving the current FPP transmission cycle according to the pre-defined timeline, as reflected by the time instants T−t 1 −t 2 , T−t 2 and T shown with reference to FIG. 3 .
- the object matching routine ( 228 ) performs timing-dependent tasks. If the sub-thread serving the FPP transmission time instant T has not reached T−t 2 , then the corresponding TransBuffer is updated based on the fusion result, and more fusion is performed, with the event flag employed to indicate that some MpVPP has just been fused into the corresponding TransBuffer ( 230 )( 232 ). Otherwise, the sub-thread is terminated and the corresponding TransBuffer will remain unchanged ( 230 )( 236 ) and be immediately wrapped into FPP by lower layers ( 238 ), and the resultant FPP is output to the lower layers at the time instant T−t 2 ( 239 ).
- the resultant FPP is in the form of a GPS location, speed and trajectory of each of the plurality of similarly-situated vehicles that are in communication with the RSU 20 .
- FIG. 3 schematically shows a timeline 300 that indicates time-dependent tasks that need to be performed, with details being related to software architecture design in timing, thread and buffer management.
- a current FPP transmission cycle 310 and a next FPP transmission cycle 320 are indicated.
- An FPP transmission period is defined as ΔT.
- Other time periods include: t 1 311 , which is the time duration from a VPP being parsed by lower layers to this VPP being ready for object matching; t 2 312 , which is the time duration for lower layers to wrap the FPP and transmit it through the PC5 interface; and t 3 313 , which is the time duration from an FPP being transmitted by the RSU 20 to this FPP being applied in terminal applications run on vehicles.
- Time durations t 1 , t 2 and t 3 indicate latencies in the system.
- Timepoint 305 indicates T, which is the FPP transmission time instant, i.e., the end of the current transmission cycle 310 , and also a point at which the sub-thread attributes are changed from “next” to “current”.
- Timepoint 301 indicates T−ΔT, which is the FPP transmission time instant of the previous transmission cycle, and timepoint 302 indicates T−ΔT+t 3 .
- Timepoint 303 indicates T−t 1 −t 2 : the time instant for the allocation of MpVPPConfLUT(T+ΔT), WaitBuffer(T+ΔT) and TransBuffer(T+ΔT), and the beginning of the assignment of sub-threads serving the FPP transmission time instant T+ΔT (i.e., the sub-threads with the attribute of “next”).
- Timepoint 304 indicates T−t 2 : the time instant for the finalization of TransBuffer(T) and its being wrapped into FPP(T), the termination of the sub-threads serving the FPP transmission time instant T, and the release of MpVPPConfLUT(T), WaitBuffer(T) and TransBuffer(T).
- Timepoint 306 indicates T+t 3 (not attended to by the timer thread): the target time instant for the motion prediction sub-threads serving the FPP transmission time instant T. Timepoint 307 represents T+ΔT−t 1 −t 2 , timepoint 308 represents T+ΔT−t 2 , and timepoint 309 represents T+ΔT, which is the FPP transmission time instant of the next transmission cycle.
- the timeline 300 includes assumptions on transmission periodicity and processing latency, in which the time duration from (T−ΔT) to T is referred to as the “current” FPP transmission cycle. Transmission periodicity refers to the time instants T−ΔT, T and T+ΔT, which reflect the FPP transmission period of ΔT. Note that the time duration of t 1 is not confined to its position shown in FIG. 3 .
- This position actually indicates the deadline (T−t 1 −t 2 ) for newly obtained VPPs to incur the sub-threads serving the FPP transmission time instant T, which is also the beginning for newly obtained VPPs to incur the sub-threads serving the FPP transmission time instant T+ΔT.
- the WaitBuffer, TransBuffer, and MpVPPConfLUT are used and associated with a specific current FPP transmission cycle or a next FPP transmission cycle, as reflected by the time instants T−t 1 −t 2 and T−t 2 in FIG. 3 pertaining to their allocation (for the T+ΔT cycle) and release (for the T cycle).
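The per-cycle bookkeeping can be condensed into a helper that derives the timeline's key instants from T, ΔT, and the latencies; values are in milliseconds and the dictionary keys are illustrative:

```python
def cycle_timepoints(T, dT, t1, t2, t3):
    """Key instants of the FPP transmission cycle ending at T (all in ms)."""
    return {
        "alloc_next": T - t1 - t2,  # allocate ConfLUT/WaitBuffer/TransBuffer for T+dT (303)
        "finalize":   T - t2,       # finalize TransBuffer(T), wrap it into FPP(T) (304)
        "transmit":   T,            # FPP transmission instant (305)
        "apply":      T + t3,       # motion-prediction target instant for cycle T (306)
        "next_cycle": T + dT,       # FPP transmission instant of the next cycle (309)
    }

tp = cycle_timepoints(T=1000, dT=100, t1=20, t2=10, t3=50)
```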
- the conflict look-up tables MpVPPConfLUT may be evaluated in complete or non-complete form depending on the method used for evaluation. Parallelization of the object matching tasks, i.e., sub-threads, is enabled by a conflict look-up and avoidance mechanism.
- The disclosure incorporates the design of perception data packet formats intended for MEC-based perception-sharing, core algorithms employing the packet formats that are tailored for use under the framework of V2X perception-sharing, and a software architecture design in timing, thread and buffer management.
- There are also other features that serve multiple objectives such as latency mitigation.
- Core algorithms making use of the packet formats and adapted for robustness related to sensing include direct sensing of vehicles and pedestrians via spatial monitoring system devices such as cameras, LIDAR and RADAR, direct sensing of vehicles from their V2X position reports, and indirect sensing via cloud-provided information.
- Core algorithms making use of the packet formats and adapted for robustness related to analysis include sensor fusion, traffic flow optimization, and vulnerable road user warnings.
- Core algorithms making use of the packet formats and adapted for robustness related to acting and other terminal applications include traffic light control, direct communication of signal phase and timing to approaching vehicles, broadband wireless hotspot connectivity (cellular and Wi-Fi), and ADAS operation.
- Embodiments in accordance with the present disclosure may be embodied as an apparatus, method, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may generally be referred to herein as a “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in a tangible medium of expression having computer-usable program code embodied in the medium.
- a computer-readable medium may include one or more of a portable computer diskette, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device.
- Computer program code for carrying out operations of the present disclosure may be written in a combination of one or more programming languages.
- Embodiments may also be implemented in cloud computing environments.
- cloud computing may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly.
- a cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), Infrastructure as a Service (“IaaS”)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.).
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations may be implemented by dedicated-function hardware-based systems that perform the specified functions or acts, or combinations of dedicated-function hardware and computer instructions.
- These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including an instruction set that implements the function/act specified in the flowchart and/or block diagram block or blocks.
Description
- This application claims the benefit of U.S. patent application Ser. No. 16/542,780 filed on Aug. 16, 2019, the disclosure of which is hereby incorporated by reference.
- There is a need to robustly monitor high density real-time traffic situations to help smooth traffic flow, improve response of emergency vehicles, and provide additional services. There is a further need for vehicles equipped with advanced driver assistance systems to robustly monitor high density real-time traffic situations.
- A systematic solution is provided for perception-sharing through vehicle-to-everything (V2X) communication, which supports a wide range of applications, thereby facilitating the implementation of related products with the use of standardized application program interfaces (APIs). Elements of this solution include a software architecture for minimizing processing latency, together with routines that combine use of distance and visual descriptors to identify overlap between perception results captured from several vehicles. The concepts enable blind area monitoring, detection, and mitigation for individual vehicles and other applications based on sensory information that is obtained over V2X communication, while mitigating processing latency and ensuring robustness in a high-density real-time traffic situation.
- An aspect of the disclosure includes a computer-implemented method for perception-sharing between a plurality of similarly-situated vehicles that are traveling on a portion of a roadway that is equipped with an intelligent vehicle highway system. This includes executing, in a multi-access edge computing cluster in communication with a roadside unit disposed to monitor a portion of a roadway, an application-layer routine. The application-layer routine includes collecting real-time data associated with a plurality of objects from each of the plurality of similarly-situated vehicles traveling on the portion of the roadway, predicting motion of each of the plurality of objects based upon the real-time data from the plurality of similarly-situated vehicles, object-matching the motion of each of the plurality of objects, wherein the object-matching is subjected to a time constraint, and executing fusion of the plurality of objects based upon the object-matching of the motion of each of the plurality of objects. Locations of the similarly-situated vehicles traveling on the portion of the roadway are identified based upon the fusion of the plurality of objects. The locations of the similarly-situated vehicles traveling on the portion of the roadway are communicated to one of the similarly-situated vehicles.
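The collect-predict-match-fuse sequence of the claimed method can be illustrated end-to-end with a deliberately simplified fusion step; averaging matched reports stands in for the object-matching and fusion routines of FIG. 2 and is an assumption for illustration only:

```python
def fuse_positions(observations):
    """Collect per-vehicle object reports, then fuse reports that refer to the
    same object by averaging their positions (a stand-in for full fusion)."""
    fused = {}
    for vpp in observations:            # one {object_id: position} dict per vehicle
        for obj_id, pos in vpp.items():
            fused.setdefault(obj_id, []).append(pos)
    return {obj_id: sum(ps) / len(ps) for obj_id, ps in fused.items()}

# Two vehicles report overlapping views; object "a" is seen by both:
locations = fuse_positions([{"a": 100.0, "b": 140.0},
                            {"a": 102.0}])
```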
- Another aspect of the disclosure includes collecting real-time data related to distance, a visual descriptor, a lane-level lateral position, and a speed estimation of each of the plurality of objects, a corresponding time stamp, and a geo-spatial positioning of the respective one of the plurality of similarly-situated vehicles.
- Another aspect of the disclosure includes linearly extrapolating the positioning data of each of the plurality of objects to predict motion of each of the plurality of objects based upon the real-time data from the plurality of similarly-situated vehicles.
- Another aspect of the disclosure includes executing blind area monitoring by at least one of the plurality of similarly-situated vehicles based upon the fusion of the plurality of objects.
- Another aspect of the disclosure includes the similarly-situated vehicles being a plurality of vehicles that are travelling in the same direction on the portion of the roadway.
- Another aspect of the disclosure includes determining a location, speed and trajectory of each of the plurality of similarly-situated vehicles.
- Another aspect of the disclosure includes an application-layer software routine in communication with a roadside unit that is arranged to monitor a portion of a roadway. The software routine includes an instruction set that is executable to collect real-time data associated with a plurality of objects from each of a plurality of similarly-situated vehicles traveling on the portion of the roadway, predict motion of each of the plurality of objects based upon the real-time data from the plurality of similarly-situated vehicles, object-match the motion of each of the plurality of objects, wherein the object-matching is subjected to a time constraint, and fuse the plurality of objects based upon the object-matching of the motion of each of the plurality of objects. Locations of the plurality of similarly-situated vehicles traveling on the portion of the roadway are identified based upon the fused plurality of objects and communicated, via the roadside unit, to one of the similarly-situated vehicles.
- Another aspect of the disclosure includes the application-layer software routine being arranged to be executed by a multi-access edge computing cluster.
- Another aspect of the disclosure includes communicating, via the roadside unit, the locations of the similarly-situated vehicles traveling on the portion of the roadway to one of the similarly-situated vehicles, wherein the one of the similarly-situated vehicles includes an advanced driver-assistance system (ADAS).
- Another aspect of the disclosure includes controlling the one of the similarly-situated vehicles that includes an advanced driver-assistance system based upon the locations of the similarly-situated vehicles traveling on the portion of the roadway.
- The above features and advantages, and other features and advantages, of the present teachings are readily apparent from the following detailed description of some of the best modes and other embodiments for carrying out the present teachings, as defined in the appended claims, when taken in connection with the accompanying drawings.
- One or more embodiments will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates a plurality of vehicles travelling on a multi-lane highway that is equipped with an embodiment of an intelligent vehicle/highway system (IVHS), in accordance with the disclosure.
FIG. 2 schematically illustrates, in flowchart form, a Multi-Access Edge Computing (MEC)-based perception-sharing routine for cooperative perception of a driving environment, in accordance with the disclosure.
FIG. 3 schematically illustrates a timeline that indicates time-dependent tasks that are performed as part of an object matching routine, in accordance with the disclosure.
- The appended drawings are not necessarily to scale, and may present a somewhat simplified representation of various preferred features of the present disclosure as disclosed herein, including, for example, specific dimensions, orientations, locations, and shapes. Details associated with such features will be determined in part by the particular intended application and use environment.
- The components of the disclosed embodiments, as described and illustrated herein, may be arranged and designed in a variety of different configurations. Thus, the following detailed description is not intended to limit the scope of the disclosure, as claimed, but is merely representative of possible embodiments thereof. In addition, while numerous specific details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed herein, some embodiments can be practiced without some of these details. Moreover, for the purpose of clarity, certain technical material that is understood in the related art has not been described in detail in order to avoid unnecessarily obscuring the disclosure. Furthermore, the drawings are in simplified form and are not to precise scale. For purposes of convenience and clarity only, directional terms such as left, right, front, rear, etc., may be used with respect to the drawings. These and similar directional terms are not to be construed to limit the scope of the disclosure. Furthermore, the disclosure, as illustrated and described herein, may be practiced in the absence of an element that is not specifically disclosed herein.
- Referring to the drawings, wherein like reference numerals correspond to like or similar components throughout the several Figures, FIG. 1 , consistent with embodiments disclosed herein, schematically illustrates a plurality of vehicles 30 that are travelling on a portion of a multi-lane highway 50 that is equipped with an embodiment of an intelligent vehicle/highway system (IVHS) 100 that includes a Multi-Access Edge Computing (MEC) cluster 10 and a plurality of road-side units (RSUs) 20 . The MEC cluster 10 includes a Multi-Access Edge Computing (MEC) perception-sharing routine (routine) 200 , as described with reference to FIGS. 2 and 3 .
- Each of the vehicles 30 may include, but is not limited to, a mobile platform in the form of a commercial vehicle, industrial vehicle, agricultural vehicle, passenger vehicle, aircraft, watercraft, train, all-terrain vehicle, personal movement apparatus, robot and the like to accomplish the purposes of this disclosure. A subset of the vehicles 30 may be connected vehicles 40 . Connected vehicles 40 are equipped with a spatial monitoring system 44 and a telematics communication system 42 that is capable of wireless extra-vehicle communications. One or more of the connected vehicles 40 may include an advanced driver-assistance system (ADAS) 46 .
- The telematics communication system 42 is capable of extra-vehicle communications, including communicating with a communication network system that may include wireless and wired communication capabilities. The telematics communication system 42 includes a telematics controller that is capable of extra-vehicle communications that include vehicle-to-everything (V2X) communication. The V2X communication includes short-range vehicle-to-vehicle (V2V) communication, and communication with one or more of the RSUs 20 , thus facilitating localized communication between a plurality of similarly-situated vehicles that are moving parts of the IVHS 100 . Alternatively or in addition, the telematics communication system 42 is capable of short-range wireless communication to a handheld device, e.g., a cell phone, a satellite phone or another telephonic device. In one embodiment the handheld device is loaded with a software application that includes a wireless protocol to communicate with the telematics controller, and the handheld device executes the extra-vehicle communication, including communicating with an off-board controller via a communication network. Alternatively or in addition, the telematics controller executes the extra-vehicle communication directly by communicating with the off-board controller via the communication network.
- The spatial monitoring system 44 includes a plurality of spatial sensors that are in communication with a spatial monitoring controller, wherein each of the spatial sensors is disposed on-vehicle to monitor a field-of-view surrounding the connected vehicle 40 , including other vehicles 30 that are proximal to the connected vehicle 40 . The spatial monitoring controller generates digital representations of each of the fields of view, including the proximal vehicles 30 , based upon data inputs from the spatial sensors. The spatial monitoring controller can evaluate inputs from the spatial sensors to determine a linear range, relative speed, and trajectory of the connected vehicle 40 in view of each of the proximal vehicles 30 . The spatial sensors can be located at various locations on the connected vehicle 40 , including the front corners, rear corners, rear sides and mid-sides. The spatial sensors can include a front radar sensor and a camera in one embodiment, although the disclosure is not so limited. Placement of the aforementioned spatial sensors permits the spatial monitoring controller to monitor traffic flow, including proximate vehicles and other objects in the vicinity of the connected vehicle 40 . The spatial sensors of the spatial monitoring system 44 can further include object-locating sensing devices including range sensors, such as FM-CW (Frequency Modulated Continuous Wave) radars, pulse and FSK (Frequency Shift Keying) radars, and Lidar (Light Detection and Ranging) devices, and ultrasonic devices which rely upon effects such as Doppler-effect measurements to locate forward objects. The possible object-locating devices include charged-coupled devices (CCD) or complementary metal oxide semi-conductor (CMOS) video image sensors, and other camera/video image processors which utilize digital photographic methods to ‘view’ forward objects including one or more vehicle(s).
- Each of the connected vehicles 40 includes a propulsion system, a wheel braking system, and a steering system. In one embodiment, the operation of the propulsion system, the wheel braking system, and the steering system may be controlled by direct interaction with a vehicle operator alone, or in combination with the ADAS 46, employing inputs from the
spatial monitoring system 44. - The
ADAS 46 is arranged to provide operator assistance features by controlling one or more of the propulsion system, the wheel braking system, and the steering system with little or no direct interaction of the vehicle operator. The ADAS 46 includes a controller and one or a plurality of subsystems that provide operator assistance features, including one or more of an adaptive cruise control (ACC) system, a lane-keeping control (LKY) system, a lane change control (LCC) system, an autonomous braking/collision avoidance system, and/or other systems that are configured to command and control autonomous vehicle operation separate from or in conjunction with operator requests. The ADAS 46 may interact with and access information from an on-board map database for route planning and to control operation of the connected vehicle 40 via the lane-keeping system, the lane-centering system, and/or other systems that are configured to command and control autonomous vehicle operation. Autonomous operating commands may be generated to control the ACC system, the LKY system, the LCC system, the autonomous braking/collision avoidance system, and/or the other systems. Vehicle operation includes operation in a propulsion mode in response to desired commands, which can include operator requests and/or autonomous vehicle requests. Vehicle operation, including autonomous vehicle operation, includes acceleration, braking, steering, steady-state running, coasting, and idling. Operator requests can be generated based upon operator inputs to an accelerator pedal, a brake pedal, a steering wheel, a transmission range selector, the ACC system, etc. - As can be appreciated, each of the connected vehicles 40 traveling on the portion of the
multi-lane highway 50 may be capable of detecting one or more of the other vehicles 30 that are proximal thereto employing inputs from the spatial monitoring system 44. However, some of the vehicles 30 proximal thereto may be undetectable due to masking caused by other, intervening vehicles 30. Masking by intervening vehicles 30 causes blind areas that cannot be perceived by one or more of the connected vehicles. - The
IVHS 100 includes the MEC cluster 10, which may be remotely-located and is in communication with the plurality of road-side units (RSUs) 20, and can be configured to monitor locations, speeds and trajectories of a plurality of vehicles 30, including a plurality of similarly-situated ones of the vehicles 30 that are travelling on the portion of the multi-lane highway 50. Similarly-situated vehicles are those vehicles 30 that are travelling in the same direction on the same portion of the multi-lane highway 50. In one embodiment, the same portion of the multi-lane highway 50 includes a portion of the multi-lane highway 50 that is within communication range of one of the RSUs 20. - The
MEC cluster 10 includes a cloud-based IT (information technology) service environment located at the edge of a network. The purpose of edge computing and the MEC cluster 10 is to bring real-time, high-bandwidth, low-latency access to latency-dependent applications that are distributed at the edge of the network. Since edge computing is closer to the end user and apps, it allows for localized and cloud-based applications. Edge computing reduces network congestion and improves application performance by executing related task processing closer to the end user, i.e., the connected vehicle 40, improving the delivery of content and applications to those users. The MEC cluster 10 moves the computing of traffic and services from a centralized cloud to the edge of the network and closer to the connected vehicle 40, and the network edge analyzes, processes, and stores the data. This serves to reduce communication and processing latency. Characteristics of the MEC cluster 10 and road-side units (RSUs) 20 include proximity, ultra-low latency, high bandwidth, and virtualization. When deployed on-vehicle, the connected vehicle 40 is able to constantly sense driving patterns, road conditions and other vehicle movements to provide guidance to the vehicle operator and the ADAS 46. Most of the predictive and prescriptive insights need to be provided in a timely manner, which means that data from the spatial monitoring system 44 needs to be collected, processed and analyzed by the MEC cluster 10 to provide low-latency insights to the vehicle operator and the ADAS 46. - The
MEC cluster 10 includes an application-layer software architecture and algorithm design that enables efficient processing for cooperative perception of the driving environment on the portion of the multi-lane highway 50, with the original perception results provided by individual ones of the connected vehicles 40, and the fused perception results provided by the RSUs 20 following the fusion tasks performed by the MEC cluster 10. The core tasks include identifying overlap between perception results from a plurality of similarly-situated vehicles, mitigating processing latency, and providing data usable by a wide range of applications to a plurality of functionality modules. Applications may be related to traffic monitoring, detection of congestion, detection of emergency vehicles and roadside assistance vehicles, access and parking system operation, enforcement systems, multi-lane and single-lane free-flow systems, etc. - The
MEC cluster 10 provides a systematic solution for perception-sharing among connected vehicles and infrastructure through multi-access edge computing (MEC), including an architecture and algorithm design of software modules for fusion tasks to be performed efficiently. The MEC cluster 10 can be used to support a wide range of applications carried by individual vehicles or infrastructure sites, with negligible errors in real-time. This enables blind area monitoring, detection, and mitigation for individual vehicles and other applications based on sensory information that is obtained over V2X, with an effective approach for latency mitigation and for ensuring robustness in highly populated real-time traffic situations. - The road-side units (RSUs) 20 are transceivers configured for Dedicated Short-Range Communications (DSRC) that may be mounted along a road or a pedestrian passageway.
RSUs 20 communicate using short-range, low-power data transmissions of limited duration. The main function of each of the RSUs 20 is to facilitate the communication between vehicles, transportation infrastructure, and other devices by transferring data over DSRC in accordance with industry standards, e.g., SAE Standard J2735 (SAE J2735—Dedicated Short Range Communications (DSRC) Message Set Dictionary). The RSUs 20 are integrated into and communicate with the MEC cluster 10. Each of the RSUs 20 may be in communication with a traffic monitoring fixture (not shown), such as a roadside camera or another device. Each RSU 20 broadcasts data to or exchanges data with connected vehicles 40 that are disposed within its communication zone and provides channel assignments and operating instructions to them. The connected vehicles 40 receive, contend for time to transmit, or are assigned a time to transmit on one or more radio frequency channels. - A systematic solution is provided for vehicle perception-sharing through V2X, based on which a wide range of applications can be supported, thereby facilitating the implementation of related products with the use of common (standardized) APIs. As appreciated, an API is a set of routines, protocols, and tools for building software applications, including core algorithms as described herein. Basically, an API specifies how software components should interact. Additionally, APIs are used when programming graphical user interface (GUI) components.
- A software architecture is developed for mitigating the processing latency, together with algorithms making combined use of distance and visual descriptors for identifying the overlap between the perception results from different ones of the connected vehicles 40. This includes features that are intended for forward-seeing objectives pertaining to vehicle perception-sharing over V2X, including problem formulation together with the corresponding functionality module design for the objective of providing common APIs that can be shared by a wide range of V2X-based applications, and a software architecture with a design of conflict-aware task parallelization. This results in awareness that is conveyed from an event-triggered conflict look-up mechanism to achieve latency mitigation, efficient computing resource utilization, and scalability. The concepts described herein facilitate connected intelligent driving (CID) in the framework of V2X perception-sharing, together with payload functionality modules benefitting from common APIs, and V2X perception-sharing algorithms driven by the principle of efficient utilization of computing resources, as well as by the need for scalability with real-time traffic flow and future wireless bandwidth expansion. The software architecture is tailored with a feature of parallelizing multiple tasks which potentially have conflicts, while the conflicts can be effectively avoided by making use of an event-triggered conflict look-up mechanism.
- The term “controller” and related terms such as microcontroller, control module, module, control, control unit, processor and similar terms refer to one or various combinations of Application Specific Integrated Circuit(s) (ASIC), Field-Programmable Gate Array (FPGA), electronic circuit(s), central processing unit(s), e.g., microprocessor(s) and associated non-transitory memory component(s) in the form of memory and storage devices (read only, programmable read only, random access, hard drive, etc.). The non-transitory memory component is capable of storing machine readable instructions in the form of one or more software or firmware programs or routines, combinational logic circuit(s), input/output circuit(s) and devices, signal conditioning and buffer circuitry and other components that can be accessed by one or more processors to provide a described functionality. Input/output circuit(s) and devices include analog/digital converters and related devices that monitor inputs from sensors, with such inputs monitored at a preset sampling frequency or in response to a triggering event. Software, firmware, programs, instructions, control routines, code, algorithms and similar terms mean controller-executable instruction sets including calibrations and look-up tables. Each controller executes control routine(s) to provide desired functions. Routines may be executed at regular intervals, for example each 100 microseconds during ongoing operation. Alternatively, routines may be executed in response to occurrence of a triggering event. Communication between controllers, and communication between controllers, actuators and/or sensors may be accomplished using a direct wired point-to-point link, a networked communication bus link, a wireless link or another suitable communication link. 
Communication includes exchanging data signals in suitable form, including, for example, electrical signals via a conductive medium, electromagnetic signals via air, optical signals via optical waveguides, and the like. The data signals may include discrete, analog or digitized analog signals representing inputs from sensors, actuator commands, and communication between controllers.
- The term “signal” refers to a physically discernible indicator that conveys information, and may be a suitable waveform (e.g., electrical, optical, magnetic, mechanical or electromagnetic), such as DC, AC, sinusoidal-wave, triangular-wave, square-wave, vibration, and the like, that is capable of traveling through a medium.
- A parameter is defined as a measurable quantity that represents a physical property of a device or other element that is discernible using one or more sensors and/or a physical model. A parameter can have a discrete value, e.g., either “1” or “0”, or can be infinitely variable in value.
-
FIG. 2 schematically shows an embodiment of the MEC routine 200, illustrating a main thread of the application-layer software for MEC-based cooperative perception of a driving environment, wherein an example of the driving environment is analogous to the portion of the multi-lane highway 50 and IVHS 100 that are described with reference to FIG. 1. The MEC routine 200 is primarily executed in the MEC cluster 10 that is described with reference to FIG. 1. The teachings may be described herein in terms of functional and/or logical block components and/or various processing steps. The MEC routine 200 is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, and/or firmware components that have been configured to perform the specified functions. In the context of software, the blocks represent computer instructions that, when executed by one or more processors, perform the recited operations. The steps of the MEC routine 200 may be executed in a suitable order, and are not limited to the order described with reference to FIG. 2. - The MEC routine 200 includes capturing a plurality of vehicle perception packets (VPP) (201), which are provided by one or more proximal connected vehicles that are operating in the driving environment and sent through an interface. Each of the VPPs encapsulates the data as needed by the core algorithms. Each VPP encapsulates the following information, by way of non-limiting examples: depth (distance); visual descriptors, e.g., an RGB or HSV color histogram, SURF, or another image feature vector; lane-level lateral position; speed estimation of each detected object; and geo-spatial positioning information of the observing vehicle together with a corresponding time stamp.
The geo-spatial positioning may be provided by a satellite navigation system that provides autonomous geo-spatial positioning with global coverage, including GNSS (Global Navigation Satellite System), Global Positioning System (GPS), and other regional systems.
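As a concrete illustration, the VPP fields enumerated above can be modeled as a small data structure. This is a minimal Python sketch under assumed field names (the disclosure does not prescribe a wire format), with the visual descriptor shown as a plain histogram vector:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectedObject:
    depth_m: float            # depth (distance) from the observing vehicle
    descriptor: List[float]   # visual descriptor, e.g. a color histogram
    lane_lateral_pos: float   # lane-level lateral position
    speed_mps: float          # speed estimate for this detected object

@dataclass
class VPP:
    """Vehicle Perception Packet: one observing vehicle's perception results."""
    vehicle_lat: float        # geo-spatial position of the observing vehicle
    vehicle_lon: float
    timestamp: float          # corresponding time stamp (seconds)
    objects: List[DetectedObject] = field(default_factory=list)

# one detected object, 12.5 m ahead, travelling at 27 m/s
vpp = VPP(vehicle_lat=42.33, vehicle_lon=-83.04, timestamp=100.0,
          objects=[DetectedObject(12.5, [0.2, 0.5, 0.3], 1.1, 27.0)])
```

The field names here are illustrative only; an actual deployment would map these onto the message formats carried over DSRC or C-V2X.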
- The VPPs, which have been parsed, are pushed into a first buffer, referred to as a RecepBuffer (202), which is a temporary storage for the VPPs waiting to be processed. The VPPs are parsed by lower layers based on the signal received through the air interface of DSRC or C-V2X or some other wireless protocol for device-to-device communication. The RecepBuffer is used without being tied to a scheduled FPP transmission, and thus does not need to be periodically allocated and released. The RecepBuffer is monitored (204), which includes periodically querying the first buffer to find the presence of a VPP, which is then taken out and conveyed to a VPP pre-processing module (208) together with the current system time (206). The VPP pre-processing module (208) includes assigning a sub-thread of motion prediction for each of the VPPs that is conveyed by the previous module, with a target FPP transmission cycle (either current or next) being a sub-thread attribute.
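The buffer-monitoring steps (204, 206) amount to polling a queue and handing each packet, together with the current system time, to pre-processing. A minimal sketch, assuming a thread-safe queue stands in for the RecepBuffer:

```python
import queue
import time

recep_buffer: "queue.Queue" = queue.Queue()   # parsed VPPs awaiting processing

def monitor_recep_buffer(dispatch, poll_interval=0.005, max_polls=3):
    """Periodically query the RecepBuffer; convey each VPP found,
    together with the current system time, to pre-processing."""
    for _ in range(max_polls):
        try:
            vpp = recep_buffer.get_nowait()
        except queue.Empty:
            time.sleep(poll_interval)   # nothing yet; poll again later
            continue
        dispatch(vpp, time.time())      # VPP plus current system time

processed = []
recep_buffer.put({"id": 1})
monitor_recep_buffer(lambda vpp, t: processed.append(vpp))
```

In the described architecture the dispatch target would be the VPP pre-processing module (208), which then spawns a motion-prediction sub-thread per packet; the loop here is bounded only so the sketch terminates.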
- The FPP (“Fused Perception Packet”) is a data packet that is a fusion of the VPPs from the plurality of similarly-situated vehicles that are in communication with the
RSU 20. The FPP encapsulates positioning data of each of the plurality of similarly-situated vehicles in communication with the RSU 20 based on the fusion of the received VPPs corresponding to a specific temporal and spatial range, supplemented by the ID and positioning information of the particular RSU 20. The FPP may include appropriate information for facilitating each of the connected vehicles 40 to identify which object described in the FPP refers to itself, i.e., to perform self-identification. The self-identification is based on an element of a Basic Safety Message (BSM) from each vehicle received by the RSU, namely the “DE TemporaryID” field included in the BSM; alternatively, a hash value of the time-frequency resources used by each vehicle for the last BSM or VPP transmission can be utilized to serve this purpose. The BSM is a message entity standardized by SAE (Society of Automotive Engineers) Standard J2735, which is intended to be broadcast by individual vehicles through an air interface. A timing advance is determined as the target FPP transmission time plus t3 minus the VPP time stamp for motion prediction, based on the system time obtained (210). - A motion prediction step (212) includes motion prediction followed by writing into a second buffer, referred to as a WaitBuffer. A motion-predicted VPP (MpVPP) is produced through the motion prediction algorithm by linearly extrapolating the positioning data of each object in the received VPP to a future time instant, as determined by one of the
RSUs 20 based on the reception time of the corresponding VPP. A position interval is determined, and is represented by the front-most and rear-most object positions. The MpVPP is transferred to the second buffer, i.e., the WaitBuffer, which is associated with the sub-thread attribute. Furthermore, an event flag indicating that a new MpVPP is being transferred to the WaitBuffer is conveyed. - The MpVPP, along with the position interval, the sub-thread attribute, and the event flag, are conveyed to the WaitBuffer (214) and used to update an MpVPP conflict look-up table (MpVPPConfLUT) as it pertains to the specific sub-thread attribute (current or next), given the inputs from
step 212 and/or step 234 (216). The MpVPP conflict look-up table indicates whether or not there is potential overlap of objects between two MpVPPs, for the exhaustive pairs across the in-buffer MpVPPs and the in-matching-processing MpVPPs. - The updated MpVPP conflict look-up table (MpVPPConfLUT) is captured (218) and subjected to conflict-free MpVPP subset extraction (220).
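The motion prediction and conflict bookkeeping just described reduce to two small computations: linearly extrapolating each object's position to the target time instant (yielding the MpVPP and its position interval), and a pairwise interval-overlap test that populates the conflict look-up table. A hedged sketch, with illustrative dictionary-based packets rather than any standardized layout:

```python
def motion_predict(vpp_objects, vpp_timestamp, target_time):
    """Linearly extrapolate each object's longitudinal position to the
    target time instant, yielding an MpVPP and its position interval
    (front-most and rear-most predicted positions)."""
    dt = target_time - vpp_timestamp
    mpvpp = [{"pos": o["pos"] + o["speed"] * dt, "speed": o["speed"]}
             for o in vpp_objects]
    positions = [o["pos"] for o in mpvpp]
    return mpvpp, (min(positions), max(positions))

def intervals_overlap(a, b):
    """Two MpVPPs can describe the same objects only if their
    position intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

def update_conflict_lut(lut, new_id, new_interval, intervals):
    """Record, for every MpVPP already tracked, whether it conflicts
    with the newly buffered one."""
    for other_id, other_interval in intervals.items():
        lut[(new_id, other_id)] = intervals_overlap(new_interval, other_interval)
    intervals[new_id] = new_interval

objs = [{"pos": 10.0, "speed": 20.0}, {"pos": 35.0, "speed": 25.0}]
mpvpp, interval_a = motion_predict(objs, vpp_timestamp=0.0, target_time=0.2)

lut, intervals = {}, {}
update_conflict_lut(lut, "A", interval_a, intervals)        # ~(14.0, 40.0)
update_conflict_lut(lut, "B", (35.0, 60.0), intervals)      # overlaps A
update_conflict_lut(lut, "C", (80.0, 95.0), intervals)      # clear of both
```

Keying the table by ordered id pairs is an illustrative choice; the disclosure only requires that overlap status be retrievable for the exhaustive pairs.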
- The conflict-free MpVPP subset extraction (220) includes determining a conflict-free subset of in-buffer MpVPPs that is also in conflict-free status with the current in-matching-processing MpVPPs based on the MpVPPConfLUT, and removing the conflict-free subset of in-buffer MpVPPs from the WaitBuffer if it is non-empty. The conflict-free subset may be optimal or sub-optimal depending on the method used.
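The extraction at step 220 can be sketched as a greedy pass over the in-buffer MpVPPs; greedy selection yields a cheap but possibly sub-optimal conflict-free subset, consistent with the note above that the subset may be optimal or sub-optimal depending on the method used:

```python
def conflict_free_subset(in_buffer, in_processing, lut):
    """Greedily pick in-buffer MpVPPs that conflict neither with the
    MpVPPs currently in matching-processing nor with each other.
    An optimal subset would require solving a maximum independent
    set; greedy selection trades optimality for speed."""
    chosen = []
    for mp in in_buffer:
        others = in_processing + chosen
        if all(not lut.get((mp, o), lut.get((o, mp), False)) for o in others):
            chosen.append(mp)
    return chosen

lut = {("A", "B"): True, ("B", "C"): True}   # A-B and B-C position overlaps
subset = conflict_free_subset(["A", "B", "C"], in_processing=[], lut=lut)
```

Here "A" and "C" are extracted together while "B", which conflicts with "A", stays in the WaitBuffer for a later pass.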
- For each MpVPP conveyed by the previous module, a sub-thread of object matching is assigned (222).
- A subset of the corresponding TransBuffer to which MpVPP is possible to be matched (TbMatchSet) is assigned according to MpVPP's position interval as a part of MpVPP pre-processing (224). The TbMatchSet and the sub-thread attribute are captured (226) and provided to an object matching routine (228), which is subject to a timing constraint.
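The restriction to a TbMatchSet (224) can be sketched as a filter of the TransBuffer against the MpVPP's position interval; field names and the optional margin are illustrative assumptions:

```python
def tb_match_set(trans_buffer, position_interval, margin=0.0):
    """Keep only fused objects whose longitudinal position falls inside
    (or within `margin` of) the MpVPP's position interval; only these
    can plausibly be matched to the MpVPP's objects."""
    lo, hi = position_interval
    return [obj for obj in trans_buffer
            if lo - margin <= obj["pos"] <= hi + margin]

trans_buffer = [{"pos": 5.0}, {"pos": 20.0}, {"pos": 90.0}]
candidates = tb_match_set(trans_buffer, (10.0, 40.0), margin=2.0)
```

Pruning the candidate set this way keeps the subsequent object matching from scoring pairs that cannot correspond to the same vehicle.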
- The object matching routine (228) identifies objects referring to the same vehicle across the MpVPPs from different ones of the connected vehicles 40. The object matching routine (228) includes performing the object matching on multiple MpVPPs and the TbMatchSet until the sub-thread serving the FPP transmission time instant T, i.e., with the attribute of “current”, is terminated at time instant T−t2. When the TbMatchSet is empty, the MpVPP is used as the fusion result directly. Based on practical considerations, the object matching routine (228) adapts maximal bipartite matching in that the edges produced by maximal matching are further pruned according to an upper threshold on the total matching score. The total matching score is calculated as the weighted harmonic mean of the matching scores pertaining to distance and visual descriptors, respectively.
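The scoring and pruning just described might look like the following sketch: a weighted harmonic mean combines the distance and descriptor scores (here lower means more similar, an assumed convention), and a greedy one-to-one pass stands in for a full maximal bipartite matching with threshold pruning:

```python
def total_matching_score(dist_score, desc_score, w_dist=1.0, w_desc=1.0):
    """Weighted harmonic mean of the distance and visual-descriptor
    matching scores (both assumed positive; lower = more similar)."""
    return (w_dist + w_desc) / (w_dist / dist_score + w_desc / desc_score)

def match_objects(scores, threshold):
    """Greedy one-to-one matching over a score matrix scores[i][j],
    discarding candidate edges whose total score exceeds the upper
    threshold (a simplification of the pruning in step 228)."""
    pairs = sorted((s, i, j) for i, row in enumerate(scores)
                   for j, s in enumerate(row))
    used_i, used_j, matched = set(), set(), []
    for s, i, j in pairs:
        if s <= threshold and i not in used_i and j not in used_j:
            matched.append((i, j))
            used_i.add(i)
            used_j.add(j)
    return matched

scores = [[0.1, 0.9], [0.8, 0.2]]   # MpVPP objects vs. TbMatchSet objects
matched = match_objects(scores, threshold=0.5)
```

A production implementation would use a proper maximum bipartite matching (e.g., the Hungarian algorithm) rather than this greedy stand-in, but the threshold-pruning behavior is the same: the 0.8 and 0.9 edges here are never accepted.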
- For the object matching routine (228), the sub-task of edge score calculation together with pruning can be parallelized, with a sub-thread assigned for each vertex (object) on the side with fewer vertices, i.e., for the N-th vertex on the side with fewer vertices. With such vertices processed in parallel, the N edges with the lowest matching scores are retained upon the completion of the calculations of the associated edge scores, and the vertices are traversed in serial for the optimal match to be determined. In addition, to reduce the cost of frequent allocation and release, the sub-threads for edge score calculation together with pruning can be populated in the form of thread pools, which are managed by the timer thread in the same way as other sub-threads, and have the number of pools and pool sizes selected empirically. The timer thread is responsible for the allocation and release of buffers, the change of sub-thread attributes, and also the termination of the sub-threads serving the current FPP transmission cycle according to the pre-defined timeline, as reflected by the time instants T−t1−t2, T−t2 and T shown with reference to
FIG. 3. - Referring again to
FIG. 2, with continued reference to FIG. 3, the object matching routine (228) performs timing-dependent tasks. If the sub-thread serving the FPP transmission time instant T has not reached T−t2, then the corresponding TransBuffer is updated based on the fusion result, and more fusion is performed, with the event flag employed to indicate that some MpVPP has just been fused into the corresponding TransBuffer (230)(232). Otherwise the sub-thread is terminated and the corresponding TransBuffer will remain unchanged (230)(236) and be immediately wrapped into an FPP by lower layers (238), and the resultant FPP is output to the lower layers at the time instant T−t2 (239). The resultant FPP is in the form of a GPS location, speed and trajectory of each of the plurality of similarly-situated vehicles that are in communication with the RSU 20. -
FIG. 3 schematically shows a timeline 300 that indicates time-dependent tasks that need to be performed, with details being related to the software architecture design in timing, thread and buffer management. A current FPP transmission cycle 310 and a next FPP transmission cycle 320 are indicated. An FPP transmission period is defined as ΔT. Other time periods include: t1 311, which is a time duration from a VPP being parsed by lower layers to this VPP being done for object matching; t2 312, which is a time duration for lower layers to wrap the FPP and transmit it through the PC5 interface; and t3 313, which is a time duration from an FPP being transmitted by the RSU 20 to this FPP being applied in terminal applications run on vehicles. Time durations t1, t2 and t3 indicate latencies in the system. - Timepoint 305 indicates T, which is the FPP transmission time instant, i.e., the end of the
current transmission cycle 310, and also a point at which the sub-thread attributes are changed from “next” to “current”. Timepoint 301 indicates T−ΔT, which is the FPP transmission time instant of the previous transmission cycle, and timepoint 302 indicates T−ΔT+t3. Timepoint 303 indicates T−t1−t2: the time instant for the allocation of MpVPPConfLUT(T+ΔT), WaitBuffer(T+ΔT) and TransBuffer(T+ΔT), and the beginning of assignment of sub-threads serving the FPP transmission time instant T+ΔT (i.e., having the sub-thread attribute of “next”) if incurred by the VPPs obtained afterwards. Timepoint 304 indicates T−t2: the time instant for the finalization of TransBuffer(T) and its being wrapped into FPP(T), the termination of the sub-threads serving the FPP transmission time instant T, and the release of MpVPPConfLUT(T), WaitBuffer(T) and TransBuffer(T). Timepoint 306 indicates T+t3 (not tracked by the timer thread): the target time instant for motion prediction sub-threads serving the FPP transmission time instant T. Timepoint 307 represents T+ΔT−t1−t2, timepoint 308 represents T+ΔT−t2, and timepoint 309 represents T+ΔT, which indicates the FPP transmission time instant of the next transmission cycle. The timeline 300 includes assumptions on transmission periodicity and processing latency, in which the time duration from (T−ΔT) to T is referred to as the “current” FPP transmission cycle. Transmission periodicity refers to the time instants T−ΔT, T and T+ΔT, which indicate the FPP transmission period of ΔT. Note that the time duration of t1 is not confined to its position shown in FIG. 3. This position actually indicates the deadline (T−t1−t2) for the newly obtained VPPs to incur the sub-threads serving the FPP transmission time instant T, which is also the beginning for the newly obtained VPPs to incur the sub-threads serving the FPP transmission time instant T+ΔT. - The
period 314, are used and associated with a specific current FPP transmission cycle or a next FPP transmission cycle, as reflected by the time instants T−t1−t2 and T−t2 inFIG. 3 pertaining to their allocation (for the T+ΔT cycle) and release (for the T cycle). The conflict look-up tables MpVPPConfLUT may be evaluated in complete or non-complete form depending on the method used for evaluation. Parallelization of the object matching tasks, i.e., sub-threads, is enabled by a conflict look-up and avoidance mechanism. - The concepts described herein provide a methodology that facilitates cooperative perception of a driving environment among similarly-situated connected vehicles 40 and
RSUs 20, intended for supporting a wide range of applications, e.g., enablement of blind area monitoring, detection, and mitigation for individual vehicles and other applications based on sensory information that is obtained over V2X. Mitigation for individual vehicles and other applications based on sensory information that is obtained over V2X may include controlling, via theADAS 46, controlling one or more of the propulsion system, the wheel braking system, and the steering system with little or no direct interaction of the vehicle operator in response to the cooperative perception of the driving environment. It incorporates design of perception data packet formats intended for MEC-based perception-sharing, core algorithms employing the packet formats that are tailored for use under the framework of V2x perception-sharing, and a software architecture design in timing, thread and buffer managements. In addition to the conflict-aware task parallelization, there are also other features to serve multiple objectives such as latency mitigation. - Core algorithms making use of the packet formats and adapted for robustness related to sensing include direct sensing of vehicles and pedestrians via spatial monitoring system devices such as cameras, LIDAR and RADAR, direct sensing of vehicles from their V2X position reports, and indirect sensing via cloud-provided information. Core algorithms making use of the packet formats and adapted for robustness related to analysis include sensor fusion, traffic flow optimization, and vulnerable road user warnings. Core algorithms making use of the packet formats and adapted for robustness related to acting and other terminal applications include traffic light control, communicate, direct communication of signal phase and timing to approaching vehicles, broadband wireless hotspot connectivity (cellular and Wi-Fi), and ADAS operation.
- Embodiments in accordance with the present disclosure may be embodied as an apparatus, method, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may generally be referred to herein as a “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in a tangible medium of expression having computer-usable program code embodied in the medium.
- Various combinations of one or more computer-usable or computer-readable media may be utilized. For example, a computer-readable medium may include one or more of a portable computer diskette, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device. Computer program code for carrying out operations of the present disclosure may be written in a combination of one or more programming languages.
- Embodiments may also be implemented in cloud computing environments. In this description and the following claims, “cloud computing” may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly. A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), Infrastructure as a Service (“IaaS”), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.).
- The flowchart and block diagrams in the flow diagrams illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by dedicated-function hardware-based systems that perform the specified functions or acts, or combinations of dedicated-function hardware and computer instructions. These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including an instruction set that implements the function/act specified in the flowchart and/or block diagram block or blocks.
- The detailed description and the drawings or figures are supportive and descriptive of the present teachings, but the scope of the present teachings is defined solely by the claims. While some of the best modes and other embodiments for carrying out the present teachings have been described in detail, various alternative designs and embodiments exist for practicing the present teachings defined in the appended claims.
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/751,804 US20210049903A1 (en) | 2019-08-16 | 2020-01-24 | Method and apparatus for perception-sharing between vehicles |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/542,780 US11574538B2 (en) | 2019-08-16 | 2019-08-16 | Method and apparatus for perception-sharing between vehicles |
US16/751,804 US20210049903A1 (en) | 2019-08-16 | 2020-01-24 | Method and apparatus for perception-sharing between vehicles |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/542,780 Continuation US11574538B2 (en) | 2019-08-16 | 2019-08-16 | Method and apparatus for perception-sharing between vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210049903A1 true US20210049903A1 (en) | 2021-02-18 |
Family
ID=74566863
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/542,780 Active 2040-01-17 US11574538B2 (en) | 2019-08-16 | 2019-08-16 | Method and apparatus for perception-sharing between vehicles |
US16/751,804 Abandoned US20210049903A1 (en) | 2019-08-16 | 2020-01-24 | Method and apparatus for perception-sharing between vehicles |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/542,780 Active 2040-01-17 US11574538B2 (en) | 2019-08-16 | 2019-08-16 | Method and apparatus for perception-sharing between vehicles |
Country Status (2)
Country | Link |
---|---|
US (2) | US11574538B2 (en) |
CN (1) | CN112396828B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022177631A (en) * | 2021-05-18 | 2022-12-01 | Hitachi, Ltd. | Control system |
CN113259900B (en) * | 2021-05-27 | 2021-10-15 | 华砺智行(武汉)科技有限公司 | Distributed multi-source heterogeneous traffic data fusion method and device |
CN113490177B (en) * | 2021-06-16 | 2024-02-13 | 北京邮电大学 | Vehicle networking high-efficiency communication method based on cloud wireless access network architecture and related equipment |
US20230166759A1 (en) * | 2021-12-01 | 2023-06-01 | Toyota Research Institute, Inc. | Systems and methods for improving localization accuracy by sharing dynamic object localization information |
CN114339678A (en) * | 2022-01-06 | 2022-04-12 | 高新兴智联科技有限公司 | Vehicle driving assisting communication method and communication system based on V2X |
CN114648870B (en) * | 2022-02-11 | 2023-07-28 | 行云新能科技(深圳)有限公司 | Edge computing system, edge computing decision prediction method, and computer-readable storage medium |
DE102022202384A1 (en) | 2022-03-10 | 2023-09-14 | Continental Automotive Technologies GmbH | Multi-access edge computing-based specific relative speed service |
CN116720663B (en) * | 2023-08-07 | 2023-11-10 | 创意(成都)数字科技有限公司 | Traffic operation management method, device, system and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120148103A1 (en) * | 2009-08-23 | 2012-06-14 | Iad Gesellschaft Fur Informatik, Automatisierung Und Datenverarbeitung Mbh | Method and system for automatic object detection and subsequent object tracking in accordance with the object shape |
US20180276485A1 (en) * | 2016-09-14 | 2018-09-27 | Nauto Global Limited | Systems and methods for safe route determination |
US20200166372A1 (en) * | 2018-11-27 | 2020-05-28 | International Business Machines Corporation | User interface |
US20200204614A1 (en) * | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method for operating a decentralized computing network, in particular an edge cloud computer of the decentralized computing network |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2266208C (en) * | 1999-03-19 | 2008-07-08 | Wenking Corp. | Remote road traffic data exchange and intelligent vehicle highway system |
US20130289824A1 (en) | 2012-04-30 | 2013-10-31 | GM Global Technology Operations LLC | Vehicle turn assist system and method |
US9645250B2 (en) | 2015-04-07 | 2017-05-09 | GM Global Technology Operations LLC | Fail operational vehicle speed estimation through data fusion of 6-DOF IMU, GPS, and radar |
US20170083790A1 (en) * | 2015-09-23 | 2017-03-23 | Behavioral Recognition Systems, Inc. | Detected object tracker for a video analytics system |
US20180374341A1 (en) | 2017-06-27 | 2018-12-27 | GM Global Technology Operations LLC | Systems and methods for predicting traffic patterns in an autonomous vehicle |
CN108447291B (en) | 2018-04-03 | 2020-08-14 | 南京锦和佳鑫信息科技有限公司 | Intelligent road facility system and control method |
US11254325B2 (en) * | 2018-07-14 | 2022-02-22 | Moove.Ai | Vehicle-data analytics |
CN108922188B (en) | 2018-07-24 | 2020-12-29 | 河北德冠隆电子科技有限公司 | Radar tracking and positioning four-dimensional live-action traffic road condition perception early warning monitoring management system |
CN110928286B (en) * | 2018-09-19 | 2023-12-26 | 阿波罗智能技术(北京)有限公司 | Method, apparatus, medium and system for controlling automatic driving of vehicle |
US11553346B2 (en) * | 2019-03-01 | 2023-01-10 | Intel Corporation | Misbehavior detection in autonomous driving communications |
2019
- 2019-08-16: US US16/542,780 (US11574538B2), Active
2020
- 2020-01-24: US US16/751,804 (US20210049903A1), Abandoned
- 2020-08-14: CN CN202010818617.XA (CN112396828B), Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11956841B2 (en) | 2020-06-16 | 2024-04-09 | At&T Intellectual Property I, L.P. | Facilitation of prioritization of accessibility of media |
US11611448B2 (en) | 2020-06-26 | 2023-03-21 | At&T Intellectual Property I, L.P. | Facilitation of predictive assisted access to content |
US11902134B2 (en) | 2020-07-17 | 2024-02-13 | At&T Intellectual Property I, L.P. | Adaptive resource allocation to facilitate device mobility and management of uncertainty in communications |
US20220018677A1 (en) * | 2020-07-20 | 2022-01-20 | At&T Intellectual Property I, L.P. | Facilitation of predictive simulation of planned environment |
US11768082B2 (en) * | 2020-07-20 | 2023-09-26 | At&T Intellectual Property I, L.P. | Facilitation of predictive simulation of planned environment |
US20230095194A1 (en) * | 2021-09-30 | 2023-03-30 | AyDeeKay LLC dba Indie Semiconductor | Dynamic and Selective Pairing Between Proximate Vehicles |
Also Published As
Publication number | Publication date |
---|---|
US11574538B2 (en) | 2023-02-07 |
US20210049902A1 (en) | 2021-02-18 |
CN112396828B (en) | 2023-04-07 |
CN112396828A (en) | 2021-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11574538B2 (en) | Method and apparatus for perception-sharing between vehicles | |
US10531254B2 (en) | Millimeter wave vehicle-to-vehicle communication system for data sharing | |
JP6682629B2 (en) | Method and control system for identifying a traffic gap between two vehicles for vehicle lane change | |
US10229590B2 (en) | System and method for improved obstable awareness in using a V2X communications system | |
US10567923B2 (en) | Computation service for mobile nodes in a roadway environment | |
US8655575B2 (en) | Real time estimation of vehicle traffic | |
JP6841263B2 (en) | Travel plan generator, travel plan generation method, and control program | |
US11146918B2 (en) | Systems and methods for network node communication using dynamically configurable interaction modes | |
US11605298B2 (en) | Pedestrian navigation based on vehicular collaborative computing | |
US11350257B2 (en) | Proxy environmental perception | |
US10896609B2 (en) | Cooperative parking space search by a vehicular micro cloud | |
CN116057605A (en) | In-vehicle apparatus, information distribution apparatus, driving support system, control method, and computer program | |
US11489792B2 (en) | Vehicular micro clouds for on-demand vehicle queue analysis | |
CN110392396B (en) | Cloud-based network optimizer for connecting vehicles | |
WO2020147390A1 (en) | Vehicle control method and device | |
US20230247399A1 (en) | Adaptive sensor data sharing for a connected vehicle | |
US11485377B2 (en) | Vehicular cooperative perception for identifying a connected vehicle to aid a pedestrian | |
JP6880586B2 (en) | Information provision method and information provision device | |
CN116255973A (en) | Vehicle positioning | |
EP3903068B1 (en) | Learned intersection map from long term sensor data | |
US20220035365A1 (en) | Vehicular nano cloud | |
WO2021229671A1 (en) | Travel assistance device and travel assistance method | |
Farhat et al. | A novel cooperative collision avoidance system for vehicular communication based on deep learning | |
CN110896530A (en) | Method, apparatus, device and storage medium for transmitting and receiving data | |
WO2023058362A1 (en) | On-board device, vehicle system, server computer, control method, and computer program |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ZHANG, SYCAMORE; QI, JIMMY; REEL/FRAME: 051613/0409. Effective date: 20190813 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |