WO2021155568A1 - Configuration for prioritizing vehicle to everything communications - Google Patents


Info

Publication number
WO2021155568A1
WO2021155568A1 (PCT application PCT/CN2020/074498)
Authority
WO
WIPO (PCT)
Application number
PCT/CN2020/074498
Other languages
French (fr)
Inventor
Lan Yu
Shailesh Patil
Hong Cheng
Dan Vassilovski
Gene Wesley Marsh
Original Assignee
Qualcomm Incorporated
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Priority to PCT/CN2020/074498
Publication of WO2021155568A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • H04W4/40: Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Definitions

  • aspects of the present disclosure generally relate to vehicle sensors, and more particularly to a sensor data sharing system for sharing sensor data between vehicles.
  • a vehicle may include a sensor system that includes one or more sensors to determine characteristics associated with the vehicle and/or characteristics associated with an environment of the vehicle.
  • a sensor system may be configured to detect proximity to an object, a weather condition, a road condition, a vehicle speed, a traffic condition, a location of the vehicle, and/or the like.
  • a method, performed by a device may include detecting an object within an environment of a vehicle; determining, based on detecting the object, a set of prioritization indicators of the object; determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmitting a communication associated with the object, according to the priority score, to share information associated with the object.
  • a device may include a memory and one or more processors operatively coupled to the memory.
  • the memory and the one or more processors may be configured to detect an object within an environment of a vehicle; determine, based on detecting the object, a set of prioritization indicators of the object; determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmit a communication associated with the object, according to the priority score, to share information associated with the object.
  • a non-transitory computer-readable medium may store one or more instructions.
  • the one or more instructions when executed by one or more processors of a device, may cause the one or more processors to detect an object within an environment of a vehicle; determine, based on detecting the object, a set of prioritization indicators of the object; determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmit a communication associated with the object, according to the priority score, to share information associated with the object.
  • an apparatus for wireless communication may include means for detecting an object within an environment of a vehicle; means for determining, based on detecting the object, a set of prioritization indicators of the object; means for determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and means for transmitting a communication associated with the object, according to the priority score, to share information associated with the object.
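  • The claimed sequence (detect an object, determine prioritization indicators, determine a priority score, transmit according to the score) can be sketched as follows. The indicator names, weights, and message format are illustrative assumptions, not values from the application:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    object_type: str       # e.g., "pedestrian", "pothole", "speed_limit_sign"
    is_moving: bool
    previously_shared: bool

def determine_indicators(d: Detection) -> dict:
    """Step 2: derive a set of prioritization indicators for the object."""
    return {
        "vru": d.object_type == "pedestrian",
        "hazard": d.object_type == "pothole",
        "moving": d.is_moving,
        "already_shared": d.previously_shared,
    }

def priority_score(ind: dict) -> float:
    """Step 3: combine the indicators into a single priority score."""
    score = 0.5 * ind["vru"] + 0.3 * ind["hazard"] + 0.2 * ind["moving"]
    if ind["already_shared"]:
        score *= 0.5  # de-prioritize objects other vehicles already know about
    return score

def transmit(d: Detection, score: float) -> str:
    """Step 4: share information about the object according to the score."""
    return f"V2X:{d.object_type}:priority={score:.2f}"
```

  • Under these assumed weights, a moving pedestrian that has not yet been shared scores 0.5 + 0.2 = 0.7, so its information would be transmitted ahead of, for example, a stationary speed limit sign.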
  • Fig. 1 is a diagram conceptually illustrating an example environment in which a sensor data sharing system described herein may be implemented, in accordance with various aspects of the present disclosure.
  • Fig. 2 is a diagram conceptually illustrating example components of one or more devices shown in Fig. 1, in accordance with various aspects of the present disclosure.
  • Figs. 3-6 are diagrams conceptually illustrating examples associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
  • Fig. 7 is a flowchart of an example process for sensor data sharing between vehicles.
  • vehicles (e.g., via electronic control units (ECUs) of vehicles) may be configured to communicate with each other and/or other devices.
  • for example, vehicles may communicate using vehicle-to-everything (V2X) communications, such as vehicle-to-vehicle (V2V) communications, and/or may communicate with roadside units (RSUs).
  • vehicles may be configured to share sensor data that is generated by sensors onboard the respective vehicles.
  • a first vehicle may share sensor data with a second vehicle to enable the second vehicle to determine characteristics of an environment of the first vehicle (e.g., an environment surrounding the first vehicle) , a speed of the first vehicle, a location of the first vehicle, and/or the like.
  • the second vehicle may be able to determine a position of the first vehicle relative to the second vehicle, may be able to detect an object near the first vehicle that cannot be sensed by a sensor system of the second vehicle (e.g., due to the object being in a blind spot of the sensor system, due to the second vehicle being out of range of the object, and/or the like) , may be able to detect road hazards (e.g., potholes) that cannot be sensed by the sensor system of the second vehicle, and/or the like.
  • a vehicle typically encounters hundreds, thousands, millions, or more detectable objects as the vehicle travels along a roadway. Because there is a maximum amount of resources (e.g., computing resources, such as processing or memory resources, communication resources, and/or the like) that are available to a sensor data sharing system, in some instances, the sensor data sharing system may not be able to share sensor data associated with some detected objects. In such a case, if the detected object is hazardous and/or dangerous to the vehicle, to vulnerable road users (VRUs) (e.g., pedestrians) , and/or to other vehicles on the road, catastrophic events (e.g., vehicle collisions, pedestrian incidents, vehicle damage, and/or the like) can occur.
  • sharing relatively unimportant sensor data can be a waste of computing resources, communication resources, and/or the like (e.g., because sharing such data may not provide any utility under certain circumstances) .
  • a sensor data sharing system enables sharing of sensor data (e.g., via V2V communications, via a roadside platform, and/or the like) between vehicles according to a prioritization scheme and/or a determined priority associated with the sensor data.
  • the sensor data sharing system may detect an object, analyze the object to determine prioritization indicators, determine a priority score associated with the object, and transmit a communication (e.g., to other vehicles or RSUs) according to the priority score to share information associated with the object (e.g., sensor data associated with the object, identification information associated with the object, location information associated with the object, motion information associated with the object, and/or the like) .
  • the priority score may be representative of a frequency with which information (e.g., sensor data) associated with the object is to be shared with other vehicles, an amount of bandwidth of a V2X communication (which may include a V2V communication) that is to be utilized to share the information associated with the object, and/or the like.
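  • The mapping from a priority score to a sharing frequency and bandwidth allotment could take a form like the following. The thresholds, intervals, and payload sizes are assumptions for the sketch, not values from the application or any V2X standard:

```python
# Map a priority score to a V2X sharing schedule: higher-priority objects are
# shared more frequently and with a larger bandwidth (payload) allotment.
def sharing_schedule(score: float) -> dict:
    if score >= 0.7:   # e.g., a VRU or hazard that has not yet been shared
        return {"interval_ms": 100, "max_payload_bytes": 1200}
    if score >= 0.3:
        return {"interval_ms": 500, "max_payload_bytes": 400}
    return {"interval_ms": 2000, "max_payload_bytes": 100}  # low priority
```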
  • the sensor data sharing system may prioritize sharing of information associated with certain types of objects (e.g., hazardous objects, VRUs, and/or the like) over other types of objects (e.g., non-hazardous objects, non-VRUs, and/or the like) , may prioritize certain objects that have not been detected as recently over other objects that have been detected more recently, may prioritize certain objects that have not been identified by other vehicles or RSUs over other objects that have been identified by other vehicles or RSUs, and/or the like. Accordingly, the sensor data sharing system may prioritize objects that are determined to be more relevant and/or more important for V2X communication between vehicles, RSUs, and/or VRUs.
  • prioritizing sharing of sensor data for a particular object may include freeing up computing resources, communication resources, power resources, and/or the like.
  • the increase in available resources can permit the sensor data to be shared more quickly, more reliably, and/or the like relative to non-prioritized sensor data.
  • prioritizing the sharing of information (and/or sensor data) associated with certain objects over other objects can improve safety, prevent catastrophic events, conserve resources associated with sharing information associated with objects that should not be prioritized, and/or the like.
  • a sensor data sharing system may be configured within one or more ECUs of one or more vehicles and/or within one or more RSUs of a roadside platform that can be communicatively coupled with the vehicle.
  • Fig. 1 is a diagram of an example environment 100 in which systems and/or methods described herein may be implemented.
  • environment 100 may include a roadside platform 110 hosted via one or more computing resources 115 (referred to individually as a “computing resource 115” and collectively as “computing resources 115” ) of a cloud computing environment 120, one or more vehicles 130-1 to 130-N (referred to individually as a “vehicle 130” and collectively as “vehicles 130” ) with corresponding ECUs 132-1 to 132-N (referred to individually as an “ECU 132” and collectively as “ECUs 132” ) , and a network 140.
  • while each of vehicles 130 is shown in Fig. 1 with a corresponding ECU 132, one or more vehicles 130 in environment 100 may include one or more ECUs 132.
  • Devices of environment 100 may interconnect via wired connections, wireless connections, or a combination of wired and wireless connections.
  • a sensor data sharing system is included in at least one of roadside platform 110 and/or one or more ECUs 132.
  • Roadside platform 110 includes one or more computing devices and/or communication devices (e.g., RSUs) assigned to receive, generate, process, and/or provide information associated with sensor data sharing, as described herein.
  • roadside platform 110 may be a platform implemented by cloud computing environment 120 that may receive sensor information associated with sensors of vehicles 130, distribute the sensor information to other vehicles 130 (e.g., vehicles coupled to RSUs of roadside platform 110) , and/or the like.
  • roadside platform 110 is implemented by computing resources 115 of cloud computing environment 120.
  • Roadside platform 110 may include a server device or a group of server devices.
  • roadside platform 110 may include one or more RSUs that include one or more server devices. Such RSUs may be configured along a roadway to permit communication with ECUs 132 of vehicles 130.
  • roadside platform 110 may be hosted in cloud computing environment 120.
  • roadside platform 110 may be non-cloud-based or may be partially cloud-based.
  • Cloud computing environment 120 includes an environment that delivers computing as a service, whereby shared resources, services, and/or the like may be provided to ECUs 132 of vehicles 130. Cloud computing environment 120 may provide computation, software, data access, storage, and/or other services that do not require end-user knowledge of a physical location and configuration of a system and/or a device that delivers the services. As shown, cloud computing environment 120 may include computing resources 115. Computing resources 115 may correspond to RSUs of roadside platform 110, as described herein. Computing resources 115 may be configured to form at least part of a sensor data sharing system, as described herein. Accordingly, computing resources 115 of roadside platform 110 may permit one or more capabilities of a sensor data sharing system to be supported in cloud computing environment 120.
  • Computing resource 115 includes one or more computers, server devices, or another type of computation and/or communication device.
  • computing resource 115 may host roadside platform 110.
  • the cloud resources may include compute instances executing in computing resource 115, storage devices provided in computing resource 115, data transfer devices provided by computing resource 115, and/or the like.
  • computing resource 115 may communicate with other computing resources 115 via wired connections, wireless connections, or a combination of wired and wireless connections.
  • computing resource 115 may include a group of cloud resources, such as one or more applications ( “APPs” ) 115-1, one or more virtual machines ( “VMs” ) 115-2, virtualized storage ( “VSs” ) 115-3, one or more hypervisors ( “HYPs” ) 115-4, or the like.
  • Application 115-1 includes one or more software applications (e.g., software applications associated with a sensor data sharing system) that may be provided to or accessed by ECU 132. Application 115-1 may eliminate a need to install and execute the software applications on ECU 132. For example, application 115-1 may include software associated with roadside platform 110 and/or any other software capable of being provided via cloud computing environment 120. In some aspects, one application 115-1 may send/receive information to/from one or more other applications 115-1, via virtual machine 115-2.
  • Virtual machine 115-2 includes a software aspect of a machine (e.g., a computer) that executes programs like a physical machine.
  • Virtual machine 115-2 may be either a system virtual machine or a process virtual machine, depending upon use and degree of correspondence to any real machine by virtual machine 115-2.
  • a system virtual machine may provide a complete system platform that supports execution of a complete operating system.
  • a process virtual machine may execute a single program and may support a single process.
  • virtual machine 115-2 may execute on behalf of a user (e.g., a user associated with vehicle 130) , and may manage infrastructure of cloud computing environment 120, such as data management, synchronization, or long-duration data transfers.
  • Virtualized storage 115-3 includes one or more storage systems and/or one or more devices that use virtualization techniques within the storage systems or devices of computing resource 115.
  • types of virtualizations may include block virtualization and file virtualization.
  • Block virtualization may refer to abstraction (or separation) of logical storage from physical storage so that the storage system may be accessed without regard to physical storage or heterogeneous structure. The separation may permit administrators of the storage system flexibility in how the administrators manage storage for end users.
  • File virtualization may eliminate dependencies between data accessed at a file level and a location where files are physically stored. This may enable optimization of storage use, server consolidation, and/or performance of non-disruptive file migrations.
  • virtualized storage 115-3 may store information associated with one or more sensor systems of one or more of vehicles 130 to permit a sensor data sharing system to determine priorities for detected objects in an environment of the one or more vehicles 130.
  • Hypervisor 115-4 provides hardware virtualization techniques that allow multiple operating systems (e.g., “guest operating systems” ) to execute concurrently on a host computer, such as computing resource 115.
  • Hypervisor 115-4 may present a virtual operating platform to the guest operating systems and may manage the execution of the guest operating systems. Multiple instances of a variety of operating systems may share virtualized hardware resources.
  • Vehicle 130 may include any vehicle that includes a sensor system as described herein.
  • vehicle 130 may be a consumer vehicle, an industrial vehicle, a commercial vehicle, and/or the like.
  • Vehicle 130 may be capable of traveling and/or providing transportation via public roadways, may be capable of use in operations associated with a worksite (e.g., a construction site) , and/or the like.
  • two or more of vehicles 130 may travel in a platoon.
  • five vehicles 130-1 to 130-5 may be designated as a platoon according to any suitable techniques.
  • the platoon of vehicles 130-1 to 130-5 may be manually configured by a user. Additionally, or alternatively, the platoon may be dynamically configured by one or more of the five vehicles 130-1 to 130-5. In such a case, the platoon may be formed based at least in part on the one or more of the five vehicles determining, from V2V communications with one another, that the five vehicles are traveling in a same direction, at relatively the same speed, and within a threshold distance of at least one of the other vehicles.
  • a first vehicle 130-1 in the platoon may be designated as a master vehicle that is to configure communication settings of the platoon of vehicles 130-1 to 130-5, including the remaining member vehicles 130-2 to 130-5.
  • Selection of the master vehicle for the platoon can be determined according to any suitable technique (e.g., based at least in part on position in the platoon, based at least in part on sensor capabilities of the master vehicle, based at least in part on communication capabilities of the master vehicle, based at least in part on processing capabilities of the master vehicle, and/or the like) .
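  • The dynamic platoon check and master selection described above can be sketched as follows. The heading, speed, and distance thresholds, and the lead-vehicle election rule, are assumptions for illustration, not values from the application:

```python
import math

HEADING_TOL_DEG = 15.0   # "same direction" tolerance (assumed)
SPEED_TOL_MPS = 3.0      # "relatively the same speed" tolerance (assumed)
DIST_THRESH_M = 100.0    # threshold distance to at least one member (assumed)

def may_join_platoon(candidate: dict, members: list) -> bool:
    """True if the candidate matches direction/speed and is near any member."""
    return any(
        abs(candidate["heading"] - m["heading"]) <= HEADING_TOL_DEG
        and abs(candidate["speed"] - m["speed"]) <= SPEED_TOL_MPS
        and math.dist(candidate["pos"], m["pos"]) <= DIST_THRESH_M
        for m in members
    )

def elect_master(members: list) -> dict:
    # One possible rule: pick the lead vehicle (furthest along the x axis);
    # sensor, communication, or processing capability could also break ties.
    return max(members, key=lambda m: m["pos"][0])
```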
  • a vehicle 130 may be controlled by ECU 132, which may include one or more devices capable of receiving, generating, storing, processing, and/or providing information associated with sensor sharing and/or a sensor data sharing system described herein.
  • ECU 132 may include a communication and/or computing device, such as an onboard computer, a control console, an operator station, or a similar type of device.
  • ECU 132 may include and/or be used to implement a sensor data sharing system, as described herein.
  • ECU 132 may permit the vehicle 130 to have one or more onboard capabilities associated with a sensor data sharing system (e.g., determining characteristics of a detected object, such as a type of object, a speed of the object, whether the object has been previously detected by the sensors and/or identified in received V2V communications, and/or the like) .
  • Network 140 includes one or more wired and/or wireless networks.
  • network 140 may include a cellular network (e.g., a long-term evolution (LTE) network, a code division multiple access (CDMA) network, a 3G network, a 4G network, a 5G network, or another type of next generation network) , a public land mobile network (PLMN) , a local area network (LAN) , a wide area network (WAN) , a metropolitan area network (MAN) , a telephone network (e.g., the Public Switched Telephone Network (PSTN) ) , and/or a combination of these or other types of networks.
  • the number and arrangement of devices and networks shown in Fig. 1 are provided as one or more examples. In practice, there may be additional devices and/or networks, fewer devices and/or networks, different devices and/or networks, or differently arranged devices and/or networks than those shown in Fig. 1. Furthermore, two or more devices shown in Fig. 1 may be implemented within a single device, or a single device shown in Fig. 1 may be implemented as multiple, distributed devices. Additionally, or alternatively, a set of devices (e.g., one or more devices) of environment 100 may perform one or more functions described as being performed by another set of devices of environment 100.
  • Fig. 2 is a diagram of example components of a device 200.
  • Device 200 may correspond to roadside platform 110, computing resource 115, ECU 132, and/or the like.
  • roadside platform 110, computing resource 115, and/or ECU 132 may include one or more devices 200 and/or one or more components of device 200.
  • device 200 may include a bus 210, a processor 220, a memory 230, a storage component 240, an input component 250, an output component 260, a communication interface 270, and one or more sensors 280 (referred to individually as a “sensor 280” and collectively as “sensors 280” ) .
  • Bus 210 includes a component that permits communication among multiple components of device 200.
  • Processor 220 is implemented in hardware, firmware, and/or a combination of hardware and software.
  • Processor 220 is a central processing unit (CPU) , a graphics processing unit (GPU) , an accelerated processing unit (APU) , a microprocessor, a microcontroller, a digital signal processor (DSP) , a field-programmable gate array (FPGA) , an application-specific integrated circuit (ASIC) , or another type of processing component.
  • processor 220 includes one or more processors capable of being programmed to perform a function.
  • Memory 230 includes a random access memory (RAM) , a read only memory (ROM) , and/or another type of dynamic or static storage device (e.g., a flash memory, a magnetic memory, and/or an optical memory) that stores information and/or instructions for use by processor 220.
  • Storage component 240 stores information and/or software related to the operation and use of device 200.
  • storage component 240 may include a hard disk (e.g., a magnetic disk, an optical disk, and/or a magneto-optic disk) , a solid state drive (SSD) , a compact disc (CD) , a digital versatile disc (DVD) , a floppy disk, a cartridge, a magnetic tape, and/or another type of non-transitory computer-readable medium, along with a corresponding drive.
  • Input component 250 includes a component that permits device 200 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, and/or a microphone) . Additionally, or alternatively, input component 250 may include a component for determining location (e.g., a global positioning system (GPS) component) and/or a sensor (e.g., an accelerometer, a gyroscope, an actuator, another type of positional or environmental sensor, and/or the like) .
  • Output component 260 includes a component that provides output information from device 200 (via, e.g., a display, a speaker, a haptic feedback component, an audio or visual indicator, and/or the like) .
  • Communication interface 270 includes a transceiver-like component (e.g., a transceiver, a separate receiver, a separate transmitter, and/or the like) that enables device 200 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections.
  • Communication interface 270 may permit device 200 to receive information from another device and/or provide information to another device.
  • communication interface 270 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi interface, a cellular network interface, and/or the like.
  • Device 200 may perform one or more processes described herein. Device 200 may perform these processes based at least in part on processor 220 executing software instructions stored by a non-transitory computer-readable medium, such as memory 230 and/or storage component 240.
  • computer-readable medium refers to a non-transitory memory device.
  • a memory device includes memory space within a single physical storage device or memory space spread across multiple physical storage devices.
  • Software instructions may be read into memory 230 and/or storage component 240 from another computer-readable medium or from another device via communication interface 270. When executed, software instructions stored in memory 230 and/or storage component 240 may cause processor 220 to perform one or more processes described herein. Additionally, or alternatively, hardware circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, aspects described herein are not limited to any specific combination of hardware circuitry and software.
  • Sensors 280 may include one or more devices capable of sensing one or more characteristics of an environment of device 200.
  • sensors 280 may include one or more of a camera, a light detection and ranging (LIDAR) sensor, a radio detection and ranging (RADAR) sensor, and/or the like.
  • sensors 280 may include any suitable sensors that may be configured within a sensor system to perform one or more operations, generate sensor data to permit one or more operations to be performed, and/or the like.
  • sensors 280 may be configured within a sensor system to detect the presence of one or more objects in an environment of device 200, detect a proximity to one or more objects in the environment of device 200, determine a location of device 200, determine a speed associated with a device 200, and/or the like.
  • sensor data generated by sensors 280 may be communicated (e.g., via communication interface 270) to another device to permit the sensor data to be used by the other device to perform one or more operations.
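  • Sensor data communicated to another device might be packaged as follows. The field names and JSON encoding are hypothetical illustrations, not taken from the application or from any V2X message standard:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SharedObject:
    object_id: str
    object_type: str    # identification information
    lat: float          # location information
    lon: float
    speed_mps: float    # motion information
    heading_deg: float
    priority: float     # priority score determined by the sharing vehicle

def encode_for_v2x(obj: SharedObject) -> bytes:
    """Serialize a detected object for transmission over a V2X link."""
    return json.dumps(asdict(obj)).encode("utf-8")
```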
  • sensor 280 may include a magnetometer (e.g., a Hall effect sensor, an anisotropic magnetoresistive (AMR) sensor, a giant magneto-resistive (GMR) sensor, and/or the like) , a location sensor (e.g., a global positioning system (GPS) receiver, a local positioning system (LPS) device (e.g., that uses triangulation, multi-lateration, etc. ) , and/or the like) , a gyroscope (e.g., a micro-electro-mechanical systems (MEMS) gyroscope or a similar type of device) , and/or the like.
  • device 200 includes means for performing one or more processes described herein and/or means for performing one or more operations of the processes described herein.
  • the means for performing the processes and/or operations described herein may include bus 210, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensors 280, and/or any combination thereof.
  • device 200 may include additional components, fewer components, different components, or differently arranged components than those shown in Fig. 2. Additionally, or alternatively, a set of components (e.g., one or more components) of device 200 may perform one or more functions described as being performed by another set of components of device 200.
  • Fig. 3 is a diagram conceptually illustrating an example 300 associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
  • Example 300 includes a first vehicle (shown as “V1” ) and a second vehicle (shown as “V2” ) (which may be referred to herein collectively as “the vehicles” ) , a base station (e.g., a base station of network 140) , and an RSU (e.g., an RSU of roadside platform 110) .
  • the first vehicle and the second vehicle may correspond to vehicles 130 of Fig. 1.
  • the first vehicle and the second vehicle of example 300 may each include an ECU (corresponding to ECU 132) to facilitate communication with each other (e.g., V2V communication) , with other vehicles, with the base station, and/or with the RSU.
  • the first and/or second vehicle may prioritize sharing of data associated with certain objects detected by the first and/or second vehicle based on characteristics of the object (e.g., type, dynamic or static (moving or not moving) , and/or the like) , based on whether the object was detected previously (e.g., by either the first or second vehicle) , based on whether information associated with the object was previously shared, and/or based on whether a past communication associated with the object has occurred within a threshold time period.
  • although certain actions of example 300 are described as being performed by a vehicle, an ECU of the vehicle may be performing the action.
  • the vehicles initiate platoon communication and designate a master vehicle.
  • the first vehicle and the second vehicle may be within a threshold distance of one another that permits V2V communication (e.g., via any suitable communication protocol) .
  • the first vehicle and the second vehicle may be associated with one or more designated groups of vehicles (e.g., one or more platoons that are configured to and/or capable of sharing sensor data, determining a priority associated with sharing information or processing information associated with a particular object, and/or the like) .
  • the first vehicle detects objects associated with the roadway.
  • the objects may include a speed limit sign along the side of the road and a pothole within the road.
  • the first vehicle may be detecting and/or processing sensor data associated with numerous other objects (e.g., including the second vehicle) .
  • the vehicles may detect an object according to any suitable technique.
  • the vehicles may use a computer vision technique, such as a convolutional neural network technique to assist in classifying sensor data and/or image data (e.g., image data generated from the sensor data that depicts one or more objects) into a particular class (e.g., VRUs, non-VRUs, hazardous objects, non-hazardous objects, mobile objects, stationary objects, and/or the like) .
  • the vehicles may determine that the objects have a particular characteristic (e.g., size, shape, and/or the like) and/or that the objects do not have a particular characteristic.
  • the vehicles may be configured to analyze sensor data and/or image data to determine whether an object represented in the sensor data and/or image data is associated with a particular type of object.
  • the computer vision technique may include using an image recognition technique (e.g., an Inception framework, a ResNet framework, a Visual Geometry Group (VGG) framework, and/or the like), an object detection technique, and/or the like.
  • the computer vision technique may include an image processing technique configured to detect particular anatomical features of an individual (e.g., to prioritize the safety of the individual) .
  • the first vehicle may detect the objects along the roadway to permit the first vehicle to determine whether to prioritize sharing information and/or sensor data associated with one or more of the objects with the second vehicle.
  • the first vehicle shares information associated with the objects with the second vehicle according to a prioritization scheme.
  • the first vehicle may determine priorities associated with detected objects using a priority scoring system (e.g., corresponding to the prioritization scheme) .
  • the first vehicle may determine a priority score associated with a detected object based on characteristics of the object, based on whether the object is newly detected by the sensors of the vehicle, whether information associated with the object has been previously shared by other vehicles (e.g., by the second vehicle), whether the object is a VRU or non-VRU (e.g., an obstacle or other vehicle), whether the object is moving or not moving, whether a status associated with the object has been shared with other vehicles (e.g., the second vehicle) within a recent threshold period of time, and/or the like.
  • the sensor data sharing system can apply weights (w) to prioritization indicators corresponding to values of the characteristics of the objects.
  • the weights may be adjustable based on the vehicle that is utilizing the scoring system.
  • the sensor data sharing system can determine (e.g., via one or more calculations associated with the scoring system) a priority score for a sensor system that is representative of a priority that is to be given with respect to sharing information associated with a detected object.
  • the sensor data sharing system can use a weighted combination, such as the following, to determine the priority score s_ij based on parameters a, b, c, ... (which may correspond to prioritization indicators of a particular object, and/or the like) of an object (or representative system) i for a vehicle (or representative vehicle) j: s_ij = w_aj · a_i + w_bj · b_i + w_cj · c_i + ...
  • prioritization indicators a_i, b_i, c_i may include a value (e.g., a characteristic-specific value (e.g., a size, a speed, and/or the like), a type-specific value (e.g., according to a detection status of the object, and/or the like), and/or the like) associated with a scale for the respective parameters a_i, b_i, c_i.
  • a higher priority (e.g., according to the priority score) for an object may correspond to information associated with the object being shared with another vehicle (or device) more frequently (e.g., every 100 milliseconds (ms) or faster, every 250 ms or faster, every 300 ms or faster, and/or the like) .
  • a lower priority (e.g., according to the priority score) for an object may correspond to information associated with the object being shared with another vehicle less frequently (e.g., every 400 ms or longer, every 500 ms or longer, every second or longer, and/or the like).
  • a higher priority may correspond to a higher percentage of communication resources being made available to share information associated with the object (e.g., 25%, 30%, 40% or more of available data units of sensor sharing within a communication).
  • a lower priority for an object may correspond to a lower percentage of communication resources (e.g., 20%, 15%, 10% or less, including 0%) being available for objects associated with the lower priority.
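The weighted scoring and the score-to-sharing-frequency mapping described above can be sketched as follows. This is an illustrative sketch only; the indicator names, weight values, and score thresholds are assumptions for demonstration and are not specified by the disclosure.

```python
# Illustrative sketch of the weighted priority scoring described above.
# Indicator names, weights, and thresholds are hypothetical, not from the disclosure.

def priority_score(indicators: dict, weights: dict) -> float:
    """Weighted combination s_ij of an object's prioritization indicators."""
    return sum(weights[name] * value for name, value in indicators.items())

def sharing_interval_ms(score: float) -> int:
    """Map a priority score to a transmission interval: higher score, more frequent sharing."""
    if score >= 0.8:
        return 100    # high priority: share every 100 ms or faster
    if score >= 0.5:
        return 300
    return 1000       # low priority: share every second or longer

# Example: a pedestrian (newly detected moving VRU) vs. a pothole (newly detected static hazard).
weights = {"newly_detected": 0.5, "vru": 0.25, "moving": 0.25}
pedestrian = {"newly_detected": 1.0, "vru": 1.0, "moving": 1.0}
pothole = {"newly_detected": 1.0, "vru": 0.0, "moving": 0.0}
```

With these hypothetical weights, the pedestrian scores 1.0 and would be shared every 100 ms, while the pothole scores 0.5 and would be shared every 300 ms, mirroring the frequency ranges given above.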
  • the first vehicle may prioritize sharing information associated with the pothole over sharing information associated with the speed limit sign because the pothole is relatively more hazardous (e.g., based on location) to the first vehicle and/or to the second vehicle (e.g., the pothole is within the road, while the sign is on the side of the road) . Accordingly, if the second vehicle is unable to detect the pothole (e.g., due to the pothole being outside of a detectable range of sensors on the second vehicle) , the second vehicle may be notified, more urgently relative to the speed limit sign (which is also likely out of range) , that there is an upcoming pothole in the roadway.
  • a vehicle may prioritize sharing information associated with detected objects according to characteristics of the objects. In this way, more dangerous, more critical, and/or more relevant information associated with V2X communication can be shared between vehicles and/or RSUs.
  • Fig. 3 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 3.
  • Fig. 4 is a diagram conceptually illustrating an example 400 associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
  • Example 400 includes a prioritization scheme (or flow) that may be used by a sensor data sharing system to determine a priority associated with an object.
  • a first priority score of 0, 1, 2, or 3 is determined based on whether an object has been detected by the vehicle and/or indicated to a vehicle (shown as “Priority 0,” “Priority 1,” “Priority 2,” or “Priority 3”), where 0 is the highest priority and 3 corresponds to the lowest priority.
  • the sensor data sharing system determines whether sensor data associated with the object, as detected by a sensor (e.g., of a vehicle, of an RSU, and/or the like), has been received via a message (e.g., via a V2X communication, a V2V communication, and/or the like).
  • the sensor data sharing system determines whether the object is a newly detected object. For example, the sensor data sharing system may determine whether the object has been previously detected by the sensor data sharing system within a most recent threshold time period (e.g., within the last 200 ms, within the last 300 ms, within the last 500 ms, and/or the like). If the sensor data sharing system determines that the object is not a newly detected object, the sensor data sharing system assigns an initialization score of “3” to the object, at block 408. If the sensor data sharing system determines that the object is a newly detected object, the sensor data sharing system assigns an initialization score of “2” to the object, at block 410.
  • the sensor data sharing system may determine whether the object is a newly detected object at block 412 (e.g., similar to block 406). If the sensor data sharing system determines that the object is not a newly detected object, the sensor data sharing system assigns an initialization score of “1” to the object at block 414. If the sensor data sharing system determines that the object is a newly detected object, the sensor data sharing system assigns an initialization score of “0” to the object at block 416. Accordingly, newly detected objects and/or objects that have not been identified from other communications are given the highest priority, while previously detected objects and/or objects that have been previously identified in other communications are given less priority.
  • the assigned priority scores may be weighted according to one or more additional characteristics (or intergroup weighting) of the objects.
  • timing associated with whether sensor data associated with the object has been transmitted more recently may be used to weight the priority score.
  • the applied weight may be according to a scale that is based on a duration of time since a past communication that included the sensor data and/or information associated with the object. In this way, the timing of a previous transmission may influence the priority with respect to increasing or decreasing the priority associated with sharing information associated with the object.
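One simple way to realize this time-based weighting can be sketched as follows. The linear scale and the 500 ms normalization constant are illustrative assumptions; the disclosure does not specify a particular weighting function.

```python
# Hypothetical sketch of the time-since-last-shared weighting described above.
# The linear scale and the 500 ms constant are illustrative assumptions.

def time_weight(ms_since_last_share: float, scale_ms: float = 500.0) -> float:
    """Return a weight in [0, 1]: recently shared objects get a low weight
    (less urgent to reshare), while stale objects approach the full weight of 1."""
    return min(ms_since_last_share / scale_ms, 1.0)
```

For example, an object shared 100 ms ago would receive a weight of 0.2, while an object not shared for 500 ms or longer would receive the full weight of 1.0, increasing its sharing priority.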
  • an order of priority associated with a type of the object can be used to weight the initialization score.
  • the type of object may be a moving VRU (“Dynamic VRU”), a different type of moving object (or non-VRU) (“Dynamic Other”), a stationary VRU (“Stationary VRU”), or a different type of stationary object (“Stationary Other”).
  • the sensor data sharing system may use the prioritization scheme to prioritize objects to permit a vehicle to share information associated with higher priority objects more frequently, with more reliability, and/or with less latency, than lower priority objects.
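The branching of example 400 can be summarized in a short sketch. The function below follows the flow of blocks 404-416 described above; the type-order list reflects the intergroup weighting, though the exact weight values attached to each type are not specified by the disclosure.

```python
# Sketch of the Fig. 4 initialization-priority flow (blocks 404-416) described above.

def initialization_priority(received_via_message: bool, newly_detected: bool) -> int:
    """Return an initialization score from 0 (highest priority) to 3 (lowest)."""
    if received_via_message:
        # Information about the object was already received via a V2X/V2V message.
        return 2 if newly_detected else 3
    # No previous communication identified the object.
    return 0 if newly_detected else 1

# Intergroup weighting order, from highest to lowest priority, per the text above.
TYPE_ORDER = ["Dynamic VRU", "Dynamic Other", "Stationary VRU", "Stationary Other"]
```

A newly detected object that no other communication has identified lands at priority 0, while a previously detected object that has also been reported by others lands at priority 3, matching blocks 416 and 408 respectively.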
  • Fig. 4 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 4.
  • Fig. 5 is a diagram conceptually illustrating an example associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
  • Example 500 includes three connected vehicles ( “CV-1, ” “CV-2, ” “CV-3” ) (referred to herein as “the connected vehicles” ) , a non-connected vehicle ( “NCV-1” ) , and pedestrians crossing a roadway, which includes an obstacle.
  • the connected vehicles are capable of V2V communication, but the non-connected vehicle is not capable of V2V communication.
  • Two frames 510 and 520 of a time period are shown in Fig. 5, with frame 510 occurring prior to frame 520.
  • Example 500 is described herein from the perspective of one of the connected vehicles, CV-1, which has a sensor detection range, as shown.
  • with respect to frame 510, CV-1 may prioritize the objects in example 500 in the following order (highest priority to lowest priority): (1) the pedestrians crossing the street (e.g., because the pedestrians are newly detected moving VRUs, and no previous related information was received), (2) the obstacle (e.g., because the obstacle is newly detected but not moving and not a VRU), and (3) CV-2 (e.g., because CV-2 was previously detected, and messages were previously received from CV-2).
  • with respect to frame 520, CV-1 may prioritize the objects in example 500 in the following order (highest priority to lowest priority): (1) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and no information associated with NCV-1 has been previously received), (2) the pedestrians (e.g., because the pedestrians are previously detected VRUs and no previous information was received), (3) the obstacle (e.g., because the obstacle was previously detected, is hazardous, and no previous information was received), (4) CV-3 (e.g., because CV-3 is newly detected, and messages were previously received from CV-3), and (5) CV-2 (e.g., because CV-2 was previously detected and messages were previously received from CV-2).
  • Fig. 5 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 5.
  • Fig. 6 is a diagram conceptually illustrating an example associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
  • Example 600 includes three connected vehicles ( “CV-1, ” “CV-2, ” “CV-3” ) (referred to herein as “the connected vehicles” ) , two non-connected vehicles ( “NCV-1” and “NCV-2” ) , and an RSU. Similar to example 500, the connected vehicles are capable of V2V communication, but the non-connected vehicles are not capable of V2V communication. Two frames 610 and 620 of a time period are shown in Fig. 6, with frame 610 occurring prior to frame 620.
  • Example 600 is described herein from the perspective of one of the connected vehicles, CV-1, which has a sensor detection range, as shown.
  • the RSU may aggregate information from the connected vehicles (e.g., for perception extension and/or traffic flow optimization) .
  • the initialization priorities of the priority scheme in example 400 can be changed in connection with example 600.
  • priority 1 and priority 2 of example 400 may be swapped.
  • NCV-2 is a new detection by CV-1.
  • with respect to frame 610, CV-1 may prioritize the objects in example 600 in the following order (highest priority to lowest priority): (1) NCV-2 (e.g., because NCV-2 is a newly detected vehicle and no previous information was received), (2) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and previous information was received from the RSU), (3) the pedestrians (e.g., because the pedestrians are previously detected VRUs), and (4) CV-2 (e.g., because CV-2 is a previously detected vehicle and messages were previously received from CV-2).
  • NCV-1 and CV-3 are new detections by CV-1.
  • with respect to frame 620, CV-1 may prioritize the objects in example 600 in the following order (highest priority to lowest priority): (1) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and no previous information was received), (2) CV-3 (e.g., because CV-3 is a newly detected vehicle and messages were previously received from CV-3), (3) NCV-2 (e.g., because NCV-2 is a previously detected vehicle and no previous information was received), (4) the pedestrians (e.g., because the pedestrians are previously detected VRUs), and (5) CV-1/CV-2 (e.g., because they are previously detected vehicles and messages were previously received from CV-1 and CV-2).
  • Fig. 6 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 6.
  • Fig. 7 is a diagram illustrating an example process 700 performed, for example, by a sensor data sharing system, in accordance with various aspects of the present disclosure.
  • Example process 700 is an example where the sensor data sharing system (e.g., a sensor data sharing system of roadside platform 110, a sensor data sharing system of ECU 132, and/or the like) performs operations associated with a configuration for prioritizing vehicle to everything communications.
  • process 700 may include detecting an object within an environment of a vehicle (block 710). For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280, and/or the like) may detect an object within an environment of a vehicle.
  • process 700 may include determining, based on detecting the object, a set of prioritization indicators of the object (block 720). For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280, and/or the like) may determine, based on detecting the object, a set of prioritization indicators of the object.
  • process 700 may include determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object (block 730). For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280, and/or the like) may determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object.
  • process 700 may include transmitting a communication associated with the object, according to the priority score, to share information associated with the object (block 740). For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280, and/or the like) may transmit a communication associated with the object, according to the priority score, to share information associated with the object.
  • Process 700 may include additional aspects, such as any single aspect or any combination of aspects described below and/or in connection with one or more other processes described elsewhere herein.
  • the set of prioritization indicators comprises at least one of: historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle, historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment, characteristic information that identifies a type of the object, mobility information that identifies a mobility status of the object or the vehicle, or shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
  • the set of prioritization indicators includes historical object detection information, the priority score corresponds to a higher priority when the historical object detection information identifies that the object has not been previously identified by a sensor system of the vehicle, and the priority score corresponds to a lower priority when the historical object detection information identifies that the object has been previously identified by the sensor system of the vehicle.
  • the set of prioritization indicators includes historical object communication information, the priority score corresponds to a higher priority when the historical object communication information identifies that no past communication associated with the object has been received from a communication device within the environment, and the priority score corresponds to a lower priority when the historical object communication information identifies that a past communication associated with the object has been received from a communication device within the environment.
  • the set of prioritization indicators includes characteristic information associated with the object, the priority score corresponds to a higher priority when the characteristic information identifies that the object is a VRU, and the priority score corresponds to a lower priority when the characteristic information identifies that the object is not a VRU.
  • the set of prioritization indicators includes characteristic information associated with the object, the priority score corresponds to a higher priority when the characteristic information identifies that the object is likely to be a hazard, and the priority score corresponds to a lower priority when the characteristic information identifies that the object is not likely to be a hazard.
  • the set of prioritization indicators includes mobility information associated with the object, the priority score corresponds to a higher priority when the mobility information identifies that the object is moving, and the priority score corresponds to a lower priority when the mobility information identifies that the object is not moving.
  • the set of prioritization indicators includes previously shared object information, the priority score corresponds to a higher priority when the previously shared object information identifies that a past communication that shared previous information associated with the object has not been transmitted within a threshold time period, and the priority score corresponds to a lower priority when the previously shared object information identifies that the past communication that shared previous information associated with the object has been transmitted within the threshold time period.
  • the priority score is determined based on a priority scoring system that applies corresponding weights to individual prioritization indicators of the set of prioritization indicators.
  • the priority score is associated with a frequency of transmitting the communication associated with the object.
  • the priority score is relative to priorities for transmitting other communications associated with other objects within the environment.
  • the communication is a vehicle-to-everything communication that is transmitted from the vehicle.
  • process 700 may include additional blocks, fewer blocks, different blocks, or differently arranged blocks than those depicted in Fig. 7. Additionally, or alternatively, two or more of the blocks of process 700 may be performed in parallel.
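Putting blocks 710-740 together, a minimal end-to-end sketch of process 700 might look like the following. The object attributes, indicator names, and equal weighting are illustrative assumptions, and `transmit` stands in for whatever V2X transmission interface the device provides; none of these names come from the disclosure.

```python
# Hypothetical end-to-end sketch of process 700 (detect -> indicators -> score -> transmit).
from dataclasses import dataclass

@dataclass
class DetectedObject:
    object_id: str
    is_vru: bool
    is_moving: bool
    newly_detected: bool

def prioritization_indicators(obj: DetectedObject) -> dict:
    """Block 720: derive a set of prioritization indicators from the detection."""
    return {
        "vru": 1.0 if obj.is_vru else 0.0,
        "moving": 1.0 if obj.is_moving else 0.0,
        "newly_detected": 1.0 if obj.newly_detected else 0.0,
    }

def priority_score(indicators: dict) -> float:
    """Block 730: combine indicators (equal hypothetical weights) into one score."""
    return sum(indicators.values()) / len(indicators)

def share_object(obj: DetectedObject, transmit) -> float:
    """Block 740: transmit a communication for the object according to its score."""
    score = priority_score(prioritization_indicators(obj))
    transmit({"object_id": obj.object_id, "priority": score})
    return score
```

In use, a caller would pass its transmission routine as `transmit`; a newly detected moving VRU yields the maximum score, while a static non-VRU scores lower and would be shared less frequently.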
  • the term “component” is intended to be broadly construed as hardware, firmware, and/or a combination of hardware and software.
  • a processor is implemented in hardware, firmware, and/or a combination of hardware and software.
  • satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, and/or the like.
  • “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c or any other ordering of a, b, and c) .
  • where only one item is intended, the phrase “only one” or similar language is used.
  • the terms “has, ” “have, ” “having, ” and/or the like are intended to be open-ended terms.
  • the phrase “based on” is intended to mean “based at least in part on” unless explicitly stated otherwise.
  • the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or, ” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of” ) .


Abstract

Various aspects of the present disclosure generally relate to sensor systems. In some aspects, a method, performed by a device, may include detecting an object within an environment of a vehicle. The method may include determining, by the device and based on detecting the object, a set of prioritization indicators of the object. The method may include determining, by the device and based on the set of prioritization indicators, a priority score associated with sharing information associated with the object. The method may include transmitting, by the device, a communication associated with the object, according to the priority score, to share information associated with the object. Numerous other aspects are provided.

Description

CONFIGURATION FOR PRIORITIZING VEHICLE TO EVERYTHING COMMUNICATIONS
FIELD OF THE DISCLOSURE
Aspects of the present disclosure generally relate to vehicle sensors, and more particularly to a sensor data sharing system for sharing sensor data between vehicles.
BACKGROUND
A vehicle may include a sensor system that includes one or more sensors to determine characteristics associated with the vehicle and/or characteristics associated with an environment of the vehicle. For example, such a sensor system may be configured to detect proximity to an object, a weather condition, a road condition, a vehicle speed, a traffic condition, a location of the vehicle, and/or the like.
SUMMARY
In some aspects, a method, performed by a device, may include detecting an object within an environment of a vehicle; determining, based on detecting the object, a set of prioritization indicators of the object; determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmitting a communication associated with the object, according to the priority score, to share information associated with the object.
In some aspects, a device may include a memory and one or more processors operatively coupled to the memory. The memory and the one or more processors may be configured to detect an object within an environment of a vehicle; determine, based on detecting the object, a set of prioritization indicators of the object; determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmit a communication associated with the object, according to the priority score, to share information associated with the object.
In some aspects, a non-transitory computer-readable medium may store one or more instructions. The one or more instructions, when executed by one or more processors of a device, may cause the one or more processors to detect an object within  an environment of a vehicle; determine, based on detecting the object, a set of prioritization indicators of the object; determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and transmit a communication associated with the object, according to the priority score, to share information associated with the object.
In some aspects, an apparatus for wireless communication may include means for detecting an object within an environment of a vehicle; means for determining, based on detecting the object, a set of prioritization indicators of the object; means for determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and means for transmitting a communication associated with the object, according to the priority score, to share information associated with the object.
Aspects generally include a method, apparatus, system, computer program product, non-transitory computer-readable medium, user device, wireless communication device, and processing system as substantially described herein with reference to and as illustrated by the accompanying drawings and specification.
The foregoing has outlined rather broadly the features and technical advantages of examples according to the disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The conception and specific examples disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. Characteristics of the concepts disclosed herein, both their organization and method of operation, together with associated advantages will be better understood from the following description when considered in connection with the accompanying figures. Each of the figures is provided for the purposes of illustration and description, and not as a definition of the limits of the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the above-recited features of the present disclosure can be understood in detail, a more particular description, briefly summarized above, may be had by reference to aspects, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this  disclosure and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects. The same reference numbers in different drawings may identify the same or similar elements.
Fig. 1 is a diagram conceptually illustrating an example environment in which a sensor data sharing system described herein may be implemented, in accordance with various aspects of the present disclosure.
Fig. 2 is a diagram conceptually illustrating example components of one or more devices shown in Fig. 1, in accordance with various aspects of the present disclosure.
Figs. 3-6 are diagrams conceptually illustrating examples associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure.
Fig. 7 is a flowchart of an example process for sensor data sharing between vehicles.
DETAILED DESCRIPTION
Various aspects of the disclosure are described more fully hereinafter with reference to the accompanying drawings. This disclosure may, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based at least in part on the teachings herein one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the disclosure disclosed herein, whether implemented independently of or combined with any other aspect of the disclosure. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method which is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect of the disclosure disclosed herein may be embodied by one or more elements of a claim.
In some instances, vehicles (e.g., via electronic control units (ECUs) of vehicles) may be configured to communicate with each other and/or other devices. For example, advances in communication technologies have enabled vehicle-to-everything  (V2X) communication, vehicle-to-vehicle (V2V) communication, and/or the like. Furthermore, one or more roadside units (RSUs) of a roadside platform may be configured to facilitate communication between vehicles, receive information associated with and/or from vehicles traveling along a roadway, provide information to and/or associated with vehicles traveling along a roadway, and/or the like. In such cases, vehicles may be configured to share sensor data that is generated by sensors onboard the respective vehicles. For example, a first vehicle may share sensor data with a second vehicle to enable the second vehicle to determine characteristics of an environment of the first vehicle (e.g., an environment surrounding the first vehicle) , a speed of the first vehicle, a location of the first vehicle, and/or the like. In this way, the second vehicle may be able to determine a position of the first vehicle relative to the second vehicle, may be able to detect an object near the first vehicle that cannot be sensed by a sensor system of the second vehicle (e.g., due to the object being in a blind spot of the sensor system, due to the second vehicle being out of range of the object, and/or the like) , may be able to detect road hazards (e.g., potholes) that cannot be sensed by the sensor system of the second vehicle, and/or the like.
A vehicle typically encounters hundreds, thousands, millions, or more detectable objects as the vehicle travels along a roadway. Because there is a maximum amount of resources (e.g., computing resources, such as processing or memory resources, communication resources, and/or the like) that are available to a sensor data sharing system, in some instances, the sensor data sharing system may not be able to share sensor data associated with some detected objects. In such a case, if the detected object is hazardous and/or dangerous to the vehicle, to vulnerable road users (VRUs) (e.g., pedestrians) , and/or to other vehicles on the road, catastrophic events (e.g., vehicle collisions, pedestrian incidents, vehicle damage, and/or the like) can occur. Moreover, sharing relatively unimportant sensor data (e.g., sensor data associated with objects that do not pose a threat to the vehicle, VRUs, or other vehicles) can be a waste of computing resources, communication resources, and/or the like (e.g., because sharing such data may not provide any utility under certain circumstances) .
According to some aspects described herein, a sensor data sharing system enables sharing of sensor data (e.g., via V2V communications, via a roadside platform, and/or the like) between vehicles according to a prioritization scheme and/or a determined priority associated with the sensor data. For example, the sensor data sharing system may detect an object, analyze the object to determine prioritization  indicators, determine a priority score associated with the object, and transmit a communication (e.g., to other vehicles or RSUs) according to the priority score to share information associated with the object (e.g., sensor data associated with the object, identification information associated with the object, location information associated with the object, motion information associated with the object, and/or the like) . The priority score may be representative of a frequency with which information (e.g., sensor data) associated with the object is to be shared with other vehicles, an amount of bandwidth of a V2X communication (which may include a V2V communication) that is to be utilized to share the information associated with the object, and/or the like.
According to some aspects, the sensor data sharing system may prioritize sharing of information associated with certain types of objects (e.g., hazardous objects, VRUs, and/or the like) over other types of objects (e.g., non-hazardous objects, non-VRUs, and/or the like), may prioritize certain objects that have not been detected as recently over other objects that have been detected more recently, may prioritize certain objects that have not been identified by other vehicles or RSUs over other objects that have been identified by other vehicles or RSUs, and/or the like. Accordingly, the sensor data sharing system may prioritize objects that are determined to be more relevant to and/or more important for V2X communication between vehicles, RSUs, and/or VRUs.
As described herein, prioritizing sharing of sensor data for a particular object may free up computing resources, communication resources, power resources, and/or the like. The increase in available resources can permit the sensor data to be shared more quickly, more reliably, and/or the like relative to non-prioritized sensor data. Furthermore, as described herein, prioritizing the sharing of information (and/or sensor data) associated with certain objects over other objects can improve safety, prevent catastrophic events, conserve resources associated with sharing information associated with objects that should not be prioritized, and/or the like.
According to some aspects described herein, a sensor data sharing system may be configured within one or more ECUs of one or more vehicles and/or within one or more RSUs of a roadside platform that can be communicatively coupled with the vehicle.
Fig. 1 is a diagram of an example environment 100 in which systems and/or methods described herein may be implemented. As shown in Fig. 1, environment 100 may include a roadside platform 110 hosted via one or more computing resources 115 (referred to individually as a “computing resource 115” and collectively as “computing resources 115”) of a cloud computing environment 120, one or more vehicles 130-1 to 130-N (referred to individually as a “vehicle 130” and collectively as “vehicles 130”) with corresponding ECUs 132-1 to 132-N (referred to individually as an “ECU 132” and collectively as “ECUs 132”), and a network 140. Although each of vehicles 130 is shown in Fig. 1 with one corresponding ECU 132 (e.g., the ECU 132 is collocated with the vehicle), one or more vehicles 130 in environment 100 may include one or more ECUs 132. Devices of environment 100 may interconnect via wired connections, wireless connections, or a combination of wired and wireless connections. As described herein, a sensor data sharing system is included in at least one of roadside platform 110 and/or one or more ECUs 132.
Roadside platform 110 includes one or more computing devices and/or communication devices (e.g., RSUs) assigned to receive, generate, process, and/or provide information associated with sensor data sharing, as described herein. For example, roadside platform 110 may be a platform implemented by cloud computing environment 120 that may receive sensor information associated with sensors of vehicles 130, distribute the sensor information to other vehicles 130 (e.g., vehicles coupled to RSUs of roadside platform 110) , and/or the like. In some aspects, roadside platform 110 is implemented by computing resources 115 of cloud computing environment 120.
Roadside platform 110 may include a server device or a group of server devices. For example, roadside platform 110 may include one or more RSUs that include one or more server devices. Such RSUs may be configured along a roadway to permit communication with ECUs 132 of vehicles 130. In some aspects, roadside platform 110 may be hosted in cloud computing environment 120. Notably, while aspects described herein may describe roadside platform 110 as being hosted in cloud computing environment 120, in some aspects, roadside platform 110 may be non-cloud-based or may be partially cloud-based.
Cloud computing environment 120 includes an environment that delivers computing as a service, whereby shared resources, services, and/or the like may be provided to ECUs 132 of vehicles 130. Cloud computing environment 120 may provide computation, software, data access, storage, and/or other services that do not require end-user knowledge of a physical location and configuration of a system and/or a device that delivers the services. As shown, cloud computing environment 120 may include computing resources 115. Computing resources 115 may correspond to RSUs of  roadside platform 110, as described herein. Computing resources 115 may be configured to form at least part of a sensor data sharing system, as described herein. Accordingly, computing resources 115 of roadside platform 110 may permit one or more capabilities of a sensor data sharing system to be supported in cloud computing environment 120.
Computing resource 115 includes one or more computers, server devices, or another type of computation and/or communication device. In some aspects, computing resource 115 may host roadside platform 110. The cloud resources may include compute instances executing in computing resource 115, storage devices provided in computing resource 115, data transfer devices provided by computing resource 115, and/or the like. In some aspects, computing resource 115 may communicate with other computing resources 115 via wired connections, wireless connections, or a combination of wired and wireless connections.
As further shown in Fig. 1, computing resource 115 may include a group of cloud resources, such as one or more applications ( “APPs” ) 115-1, one or more virtual machines ( “VMs” ) 115-2, virtualized storage ( “VSs” ) 115-3, one or more hypervisors ( “HYPs” ) 115-4, or the like.
Application 115-1 includes one or more software applications (e.g., software applications associated with a sensor data sharing system) that may be provided to or accessed by ECU 132. Application 115-1 may eliminate a need to install and execute the software applications on ECU 132. For example, application 115-1 may include software associated with roadside platform 110 and/or any other software capable of being provided via cloud computing environment 120. In some aspects, one application 115-1 may send/receive information to/from one or more other applications 115-1, via virtual machine 115-2.
Virtual machine 115-2 includes a software aspect of a machine (e.g., a computer) that executes programs like a physical machine. Virtual machine 115-2 may be either a system virtual machine or a process virtual machine, depending upon use and degree of correspondence to any real machine by virtual machine 115-2. A system virtual machine may provide a complete system platform that supports execution of a complete operating system. A process virtual machine may execute a single program and may support a single process. In some aspects, virtual machine 115-2 may execute on behalf of a user (e.g., a user associated with vehicle 130) , and may manage  infrastructure of cloud computing environment 120, such as data management, synchronization, or long-duration data transfers.
Virtualized storage 115-3 includes one or more storage systems and/or one or more devices that use virtualization techniques within the storage systems or devices of computing resource 115. In some aspects, within the context of a storage system, types of virtualizations may include block virtualization and file virtualization. Block virtualization may refer to abstraction (or separation) of logical storage from physical storage so that the storage system may be accessed without regard to physical storage or heterogeneous structure. The separation may permit administrators of the storage system flexibility in how the administrators manage storage for end users. File virtualization may eliminate dependencies between data accessed at a file level and a location where files are physically stored. This may enable optimization of storage use, server consolidation, and/or performance of non-disruptive file migrations. In some aspects, virtualized storage 115-3 may store information associated with one or more sensor systems of one or more of vehicles 130 to permit a sensor data sharing system to determine priorities for detected objects in an environment of the one or more vehicles 130.
Hypervisor 115-4 provides hardware virtualization techniques that allow multiple operating systems (e.g., “guest operating systems” ) to execute concurrently on a host computer, such as computing resource 115. Hypervisor 115-4 may present a virtual operating platform to the guest operating systems and may manage the execution of the guest operating systems. Multiple instances of a variety of operating systems may share virtualized hardware resources.
Vehicle 130 may include any vehicle that includes a sensor system as described herein. For example, vehicle 130 may be a consumer vehicle, an industrial vehicle, a commercial vehicle, and/or the like. Vehicle 130 may be capable of traveling and/or providing transportation via public roadways, may be capable of use in operations associated with a worksite (e.g., a construction site) , and/or the like.
According to aspects described herein, two or more of vehicles 130 may travel in a platoon. As an example, five vehicles 130-1 to 130-5 may be designated as a platoon according to any suitable technique. The platoon of vehicles 130-1 to 130-5 may be manually configured by a user. Additionally, or alternatively, the platoon may be dynamically configured by one or more of the five vehicles 130-1 to 130-5, based at least in part on the one or more of the five vehicles determining, from V2V communications with one another, that the five vehicles are traveling in a same direction, at relatively the same speed, and within a threshold distance of at least one of the other vehicles. Further, a first vehicle 130-1 in the platoon may be designated as a master vehicle that is to configure communication settings of the platoon of vehicles 130-1 to 130-5, including the remaining member vehicles 130-2 to 130-5. Selection of the master vehicle for the platoon can be determined according to any suitable technique (e.g., based at least in part on position in the platoon, sensor capabilities of the master vehicle, communication capabilities of the master vehicle, processing capabilities of the master vehicle, and/or the like).
Vehicle 130 may be controlled by ECU 132, which may include one or more devices capable of receiving, generating, storing, processing, and/or providing information associated with sensor sharing and/or a sensor data sharing system described herein. For example, ECU 132 may include a communication and/or computing device, such as an onboard computer, a control console, an operator station, or a similar type of device. In some aspects, ECU 132 may include and/or be used to implement a sensor data sharing system, as described herein. For example, ECU 132 may permit the vehicle 130 to have one or more onboard capabilities associated with a sensor data sharing system (e.g., determining characteristics of a detected object, such as a type of object, a speed of the object, whether the object has been previously detected by the sensors and/or identified in received V2V communications, and/or the like).
Network 140 includes one or more wired and/or wireless networks. For example, network 140 may include a cellular network (e.g., a long-term evolution (LTE) network, a code division multiple access (CDMA) network, a 3G network, a 4G network, a 5G network, another type of next generation network, etc. ) , a public land mobile network (PLMN) , a local area network (LAN) , a wide area network (WAN) , a metropolitan area network (MAN) , a telephone network (e.g., the Public Switched Telephone Network (PSTN) ) , a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, or the like, and/or a combination of these or other types of networks.
The number and arrangement of devices and networks shown in Fig. 1 are provided as one or more examples. In practice, there may be additional devices and/or networks, fewer devices and/or networks, different devices and/or networks, or  differently arranged devices and/or networks than those shown in Fig. 1. Furthermore, two or more devices shown in Fig. 1 may be implemented within a single device, or a single device shown in Fig. 1 may be implemented as multiple, distributed devices. Additionally, or alternatively, a set of devices (e.g., one or more devices) of environment 100 may perform one or more functions described as being performed by another set of devices of environment 100.
Fig. 2 is a diagram of example components of a device 200. Device 200 may correspond to roadside platform 110, computing resource 115, ECU 132, and/or the like. In some aspects, roadside platform 110, computing resource 115, and/or ECU 132 may include one or more devices 200 and/or one or more components of device 200. As shown in Fig. 2, device 200 may include a bus 210, a processor 220, a memory 230, a storage component 240, an input component 250, an output component 260, a communication interface 270, and one or more sensors 280 (referred to individually as a “sensor 280” and collectively as “sensors 280” ) .
Bus 210 includes a component that permits communication among multiple components of device 200. Processor 220 is implemented in hardware, firmware, and/or a combination of hardware and software. Processor 220 is a central processing unit (CPU) , a graphics processing unit (GPU) , an accelerated processing unit (APU) , a microprocessor, a microcontroller, a digital signal processor (DSP) , a field-programmable gate array (FPGA) , an application-specific integrated circuit (ASIC) , or another type of processing component. In some aspects, processor 220 includes one or more processors capable of being programmed to perform a function. Memory 230 includes a random access memory (RAM) , a read only memory (ROM) , and/or another type of dynamic or static storage device (e.g., a flash memory, a magnetic memory, and/or an optical memory) that stores information and/or instructions for use by processor 220.
Storage component 240 stores information and/or software related to the operation and use of device 200. For example, storage component 240 may include a hard disk (e.g., a magnetic disk, an optical disk, and/or a magneto-optic disk) , a solid state drive (SSD) , a compact disc (CD) , a digital versatile disc (DVD) , a floppy disk, a cartridge, a magnetic tape, and/or another type of non-transitory computer-readable medium, along with a corresponding drive.
Input component 250 includes a component that permits device 200 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a  keypad, a mouse, a button, a switch, and/or a microphone) . Additionally, or alternatively, input component 250 may include a component for determining location (e.g., a global positioning system (GPS) component) and/or a sensor (e.g., an accelerometer, a gyroscope, an actuator, another type of positional or environmental sensor, and/or the like) . Output component 260 includes a component that provides output information from device 200 (via, e.g., a display, a speaker, a haptic feedback component, an audio or visual indicator, and/or the like) .
Communication interface 270 includes a transceiver-like component (e.g., a transceiver, a separate receiver, a separate transmitter, and/or the like) that enables device 200 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections. Communication interface 270 may permit device 200 to receive information from another device and/or provide information to another device. For example, communication interface 270 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi interface, a cellular network interface, and/or the like.
Device 200 may perform one or more processes described herein. Device 200 may perform these processes based at least in part on processor 220 executing software instructions stored by a non-transitory computer-readable medium, such as memory 230 and/or storage component 240. As used herein, the term “computer-readable medium” refers to a non-transitory memory device. A memory device includes memory space within a single physical storage device or memory space spread across multiple physical storage devices.
Software instructions may be read into memory 230 and/or storage component 240 from another computer-readable medium or from another device via communication interface 270. When executed, software instructions stored in memory 230 and/or storage component 240 may cause processor 220 to perform one or more processes described herein. Additionally, or alternatively, hardware circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, aspects described herein are not limited to any specific combination of hardware circuitry and software.
Sensors 280 may include one or more devices capable of sensing one or more characteristics of an environment of device 200. For example, sensors 280 may  include one or more of a camera, a light detection and ranging (LIDAR) sensor, a radio detection and ranging (RADAR) sensor, and/or the like. Accordingly, sensors 280 may include any suitable sensors that may be configured within a sensor system to perform one or more operations, generate sensor data to permit one or more operations to be performed, and/or the like. For example, sensors 280 may be configured within a sensor system to detect the presence of one or more objects in an environment of device 200, detect a proximity to one or more objects in the environment of device 200, determine a location of device 200, determine a speed associated with a device 200, and/or the like. As described herein, sensor data generated by sensors 280 may be communicated (e.g., via communication interface 270) to another device to permit the sensor data to be used by the other device to perform one or more operations.
Additionally, or alternatively, sensor 280 may include a magnetometer (e.g., a Hall effect sensor, an anisotropic magnetoresistive (AMR) sensor, a giant magneto-resistive sensor (GMR) , and/or the like) , a location sensor (e.g., a global positioning system (GPS) receiver, a local positioning system (LPS) device (e.g., that uses triangulation, multi-lateration, etc. ) , and/or the like) , a gyroscope (e.g., a micro-electro-mechanical systems (MEMS) gyroscope or a similar type of device) , an accelerometer, a speed sensor, a motion sensor, an infrared sensor, a temperature sensor, a pressure sensor, and/or the like.
In some aspects, device 200 includes means for performing one or more processes described herein and/or means for performing one or more operations of the processes described herein. For example, the means for performing the processes and/or operations described herein may include bus 210, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensors 280, and/or any combination thereof.
The number and arrangement of components shown in Fig. 2 are provided as an example. In practice, device 200 may include additional components, fewer components, different components, or differently arranged components than those shown in Fig. 2. Additionally, or alternatively, a set of components (e.g., one or more components) of device 200 may perform one or more functions described as being performed by another set of components of device 200.
Fig. 3 is a diagram conceptually illustrating an example 300 associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure. Example 300 includes a first vehicle (shown as “V1”) and a second vehicle (shown as “V2”) (which may be referred to herein collectively as “the vehicles”), a base station (e.g., a base station of network 140), and an RSU (e.g., an RSU of roadside platform 110). The first vehicle and the second vehicle may correspond to vehicles 130 of Fig. 1. Accordingly, the first vehicle and the second vehicle of example 300 may each include an ECU (corresponding to ECU 132) to facilitate communication with each other (e.g., V2V communication), with other vehicles, with the base station, and/or with the RSU. As described herein, the first and/or second vehicle may prioritize sharing of sensor data associated with certain objects detected by the first and/or second vehicle based on characteristics of the objects (e.g., type, dynamic or static (moving or not moving), and/or the like), based on whether the object was detected previously (e.g., by either the first or second vehicle), based on whether information associated with the object was previously shared, and/or based on whether a past communication associated with the object has occurred within a threshold time period.
As described herein, when referring to a vehicle of the platoon performing an action (e.g., receiving information, communicating with another entity, determining a reliability of a sensor system, enabling use of sensor data, using sensor data, and/or the like) , it is to be understood that an ECU of the vehicle may be performing the action.
As shown in Fig. 3, and by reference number 310, the vehicles initiate platoon communication and designate a master vehicle. For example, the first vehicle and the second vehicle may be within a threshold distance of one another that permits V2V communication (e.g., via any suitable communication protocol) . Additionally, or alternatively, the first vehicle and the second vehicle may be associated with one or more designated groups of vehicles (e.g., one or more platoons that are configured to and/or capable of sharing sensor data, determining a priority associated with sharing information or processing information associated with a particular object, and/or the like) .
As further shown in Fig. 3, and by reference number 320, the first vehicle detects objects associated with the roadway. For example, the objects may include a speed limit sign along the side of the road and a pothole within the road. Simultaneously, the first vehicle may be detecting and/or processing sensor data associated with numerous other objects (e.g., including the second vehicle) .
As described herein, the vehicles may detect an object according to any suitable technique. For example, the vehicles may use a computer vision technique, such as a convolutional neural network technique, to assist in classifying sensor data and/or image data (e.g., image data generated from the sensor data that depicts one or more objects) into a particular class (e.g., VRUs, non-VRUs, hazardous objects, non-hazardous objects, mobile objects, stationary objects, and/or the like). More specifically, the vehicles may determine that an object has a particular characteristic (e.g., size, shape, and/or the like). On the other hand, the vehicles may determine that an object does not have a particular characteristic and/or does not belong to a particular class.
Furthermore, the vehicles may be configured to analyze sensor data and/or image data to determine whether an object represented in the sensor data and/or image data is associated with a particular type of object. In some implementations, the computer vision technique may include using an image recognition technique (e.g., an Inception framework, a ResNet framework, a Visual Geometry Group (VGG) framework, and/or the like), an object detection technique (e.g., a Single Shot Detector (SSD) framework, a You Only Look Once (YOLO) framework, a cascade classification technique (e.g., a Haar cascade technique, a boosted cascade, a local binary pattern technique, and/or the like), and/or the like), an edge detection technique, an object-in-motion technique (e.g., an optical flow framework and/or the like), and/or the like. Additionally, or alternatively, the computer vision technique may include an image processing technique configured to detect particular anatomical features of an individual (e.g., to prioritize the safety of the individual).
In this way, the first vehicle may detect the objects along the roadway to permit the first vehicle to determine whether to prioritize sharing information and/or sensor data associated with one or more of the objects with the second vehicle.
As further shown in Fig. 3, and by reference number 330, the first vehicle shares information associated with the objects with the second vehicle according to a prioritization scheme. For example, the first vehicle may determine priorities associated with detected objects using a priority scoring system (e.g., corresponding to the prioritization scheme). More specifically, the first vehicle may determine a priority score associated with a detected object based on characteristics of the object, whether the object is newly detected by the sensors of the vehicle, whether information associated with the object has been previously shared by other vehicles (e.g., by the second vehicle), whether the object is a VRU or a non-VRU (e.g., an obstacle or another vehicle), whether the object is moving or not moving, whether a status associated with the object has been shared with other vehicles (e.g., the second vehicle) within a recent threshold period of time, and/or the like. Using such a scoring system, the sensor data sharing system can apply weights (w) to prioritization indicators corresponding to values of the characteristics of the objects. Additionally, or alternatively, the weights may be adjustable based on the vehicle that is utilizing the scoring system. Accordingly, the sensor data sharing system can determine (e.g., via one or more calculations associated with the scoring system) a priority score for a detected object that is representative of a priority that is to be given with respect to sharing information associated with the object.
In some aspects, the sensor data sharing system can use the following to determine the priority score (s_ij) based on parameters a, b, c, … (which may correspond to prioritization indicators of a particular object, and/or the like) of an object (or representative system) i for a vehicle (or representative vehicle) j:

s_ij = w_aj·a_i + w_bj·b_i + w_cj·c_i + …             (1)

where w_aj, w_bj, w_cj, … correspond to adjusted weights for the prioritization indicators a_i, b_i, c_i, … associated with the object i being analyzed by vehicle j. For example, prioritization indicators a_i, b_i, c_i may include a value (e.g., a characteristic-specific value (e.g., a size, a speed, and/or the like), a type-specific value (e.g., according to a detection status of the object, and/or the like), and/or the like) associated with a scale for the respective parameters a_i, b_i, c_i. In some implementations, the adjusted weights w_aj, w_bj, w_cj may be normalized (e.g., such that 0 ≤ w_aj, w_bj, w_cj ≤ 1 and w_aj + w_bj + w_cj = 1).
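As an illustration, the weighted sum in equation (1) can be sketched as follows. The indicator names, the weight values, and the normalization step are assumptions chosen for the sketch, not values specified by this disclosure.

```python
# Hypothetical sketch of the priority score in equation (1):
#   s_ij = w_aj*a_i + w_bj*b_i + w_cj*c_i + ...
# Indicator names and weight values below are illustrative assumptions.

def priority_score(indicators, weights):
    """Weighted sum of prioritization indicators for one object/vehicle pair."""
    if set(indicators) != set(weights):
        raise ValueError("indicators and weights must use the same keys")
    total = sum(weights.values())
    # Normalize the weights so they sum to 1, as suggested in the text.
    weights = {name: w / total for name, w in weights.items()}
    return sum(weights[name] * indicators[name] for name in indicators)

# Example: indicator values on a 0-to-1 scale for a detected pothole.
pothole_indicators = {"hazard": 0.9, "newly_detected": 1.0, "is_vru": 0.0}
vehicle_weights = {"hazard": 0.5, "newly_detected": 0.3, "is_vru": 0.2}
score = priority_score(pothole_indicators, vehicle_weights)  # 0.75
```

Because the weights are normalized per vehicle j, two vehicles can weigh the same indicators differently (e.g., a platoon master emphasizing hazards) while still producing scores on a comparable scale.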
As described herein, a higher priority (e.g., according to the priority score) for an object may correspond to information associated with the object being shared with another vehicle (or device) more frequently (e.g., every 100 milliseconds (ms) or faster, every 250 ms or faster, every 300 ms or faster, and/or the like). Meanwhile, a lower priority (e.g., according to the priority score) for an object may correspond to information associated with the object being shared with another vehicle less frequently (e.g., every 400 ms or longer, every 500 ms or longer, every second or longer, and/or the like). Additionally, or alternatively, a higher priority may correspond to a higher percentage of communication resources being made available to share information associated with the object (e.g., 25%, 30%, 40% or more of available data units of sensor sharing within a communication), while a lower priority for an object may correspond to a lower percentage of communication resources (e.g., 20%, 15%, 10% or less, including 0%) being available.
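A minimal sketch of such a mapping from priority score to sharing interval and resource share follows. The score thresholds, intervals, and bandwidth fractions are assumptions picked from the example ranges above, not a mapping defined by this disclosure.

```python
# Illustrative mapping from a normalized priority score in [0, 1] to a
# sharing interval and a fraction of available communication resources.
# Thresholds and values are assumptions for the sketch.

def sharing_parameters(score):
    """Return (interval_ms, bandwidth_fraction) for a priority score."""
    if score >= 0.75:
        return 100, 0.40   # highest priority: share every 100 ms, 40% of resources
    if score >= 0.50:
        return 300, 0.25   # higher priority: every 300 ms, 25%
    if score >= 0.25:
        return 500, 0.10   # lower priority: every 500 ms, 10%
    return 1000, 0.0       # lowest priority: every second or longer, possibly skipped

interval_ms, fraction = sharing_parameters(0.8)  # (100, 0.4)
```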
Specifically referring to example 300 in Fig. 3, the first vehicle may prioritize sharing information associated with the pothole over sharing information associated with the speed limit sign because the pothole is relatively more hazardous (e.g., based on location) to the first vehicle and/or to the second vehicle (e.g., the pothole is within the road, while the sign is on the side of the road) . Accordingly, if the second vehicle is unable to detect the pothole (e.g., due to the pothole being outside of a detectable range of sensors on the second vehicle) , the second vehicle may be notified, more urgently relative to the speed limit sign (which is also likely out of range) , that there is an upcoming pothole in the roadway.
In this way, a vehicle may prioritize sharing information associated with detected objects according to characteristics of the objects, so that information associated with more dangerous, more critical, and/or more relevant objects can be shared via V2X communication between vehicles and/or RSUs.
As indicated above, Fig. 3 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 3.
Fig. 4 is a diagram conceptually illustrating an example 400 associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure. Example 400 includes a prioritization scheme (or flow) that may be used by a sensor data sharing system to determine a priority associated with an object.
As shown in Fig. 4, a first priority score of 0, 1, 2, or 3 is determined based on whether an object has been detected by the vehicle and/or indicated to the vehicle (shown as “Priority 0, ” “Priority 1, ” “Priority 2, ” or “Priority 3, ” where 0 is the highest priority and 3 corresponds to the lowest priority). At block 402 of the prioritization scheme, a sensor (e.g., of a vehicle, of an RSU, and/or the like) detects an object. At block 404, the sensor data sharing system determines whether sensor data associated with the object has been received via a message (e.g., via a V2X communication, a V2V communication, and/or the like). If the sensor data for the object has been received, at block 406, the sensor data sharing system determines whether the object is a newly detected object. For example, the sensor data sharing system may determine whether the object has been previously detected by the sensor data sharing system within a most recent threshold time period (e.g., within the last 200 ms, within the last 300 ms, within the last 500 ms, and/or the like). If the sensor data sharing system determines that the object is not a newly detected object, the sensor data sharing system assigns an initialization score of “3” to the object, at block 408. If the sensor data sharing system determines that the object is a newly detected object, the sensor data sharing system assigns an initialization score of “2” to the object, at block 410.
At block 404, if the sensor data sharing system determines that sensor data for the object has not been received in a message, the sensor data sharing system may determine whether the object is a newly detected object at block 412 (e.g., similar to block 406). If the sensor data sharing system determines that the object is not a newly detected object, the sensor data sharing system assigns an initialization score of “1” to the object at block 414. If the sensor data sharing system determines that the object is a newly detected object, the sensor data sharing system assigns an initialization score of “0” to the object at block 416. Accordingly, newly detected objects and/or objects that have not been identified from other communications are given the highest priority, while previously detected objects and/or objects that have been previously identified in other communications are given lower priority.
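The two-question decision flow of blocks 404 through 416 can be sketched as a small function. The function name and boolean parameters are illustrative assumptions; the returned scores follow the assignments described above (0 = highest priority, 3 = lowest).

```python
def initialization_score(received_in_message: bool, newly_detected: bool) -> int:
    """Assign the Fig. 4 initialization score for a detected object.

    received_in_message -- sensor data for the object was received via a
                           V2X/V2V message (block 404).
    newly_detected      -- the object was not previously detected within
                           the most recent threshold time period
                           (blocks 406/412).
    """
    if received_in_message:
        # Blocks 406-410: information already exists elsewhere,
        # so the object receives lower priority overall.
        return 2 if newly_detected else 3
    # Blocks 412-416: no other source has reported this object.
    return 0 if newly_detected else 1
```

For example, a newly detected object for which no message was received gets score 0 (highest priority), while a previously detected object already reported in a message gets score 3 (lowest priority).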
As further shown in Fig. 4, the assigned priority scores may be weighted according to one or more additional characteristics (or intergroup weighting) of the objects. As shown by reference number 418, timing associated with whether sensor data associated with the object has been transmitted recently may be used to weight the priority score. For example, the applied weight may be according to a scale that is based on a duration of time since a past communication that included the sensor data and/or information associated with the object. In this way, the timing of a previous transmission may influence the priority with respect to increasing or decreasing the priority associated with sharing information associated with the object. Additionally, or alternatively, as shown by reference number 420, an order of priority associated with a type of the object can be used to weight the initialization score. For example, the type of object may be a moving VRU ( “Dynamic VRU” ) , a different type of moving object (or non-VRU) ( “Dynamic Other” ) , a stationary VRU ( “Stationary VRU” ) , or a different type of stationary object ( “Stationary Other” ) .
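One way to combine the initialization score with the two weighting factors above can be sketched as follows. The type ordering follows the text (dynamic VRUs highest), but the numeric weights, the 1-second recency scale, and the function names are assumptions chosen purely for illustration.

```python
# Illustrative intergroup weights: smaller weight = higher priority.
TYPE_WEIGHT = {
    "dynamic_vru": 0.0,
    "dynamic_other": 0.5,
    "stationary_vru": 1.0,
    "stationary_other": 1.5,
}

def weighted_score(init_score: int, object_type: str,
                   ms_since_last_shared: float) -> float:
    """Weight the initialization score by object type (reference
    number 420) and by time since the object's information was last
    shared (reference number 418). Lower score = higher priority."""
    # Objects shared very recently are deprioritized; the penalty
    # decays linearly to zero over an assumed 1-second window.
    recency_weight = max(0.0, 1.0 - ms_since_last_shared / 1000.0)
    return init_score + TYPE_WEIGHT.get(object_type, 1.5) + recency_weight
```

Under this sketch, a newly detected dynamic VRU that has not been shared recently scores 0.0 (highest priority), while a previously reported stationary object shared moments ago scores near the maximum.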
In this way, the sensor data sharing system may use the prioritization scheme to prioritize objects to permit a vehicle to share information associated with higher priority objects more frequently, with more reliability, and/or with less latency, than lower priority objects.
As indicated above, Fig. 4 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 4.
Fig. 5 is a diagram conceptually illustrating an example associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure. Example 500 includes three connected vehicles ( “CV-1, ” “CV-2, ” “CV-3” ) (referred to herein as “the connected vehicles” ) , a non-connected vehicle ( “NCV-1” ) , and pedestrians crossing a roadway, which includes an obstacle. The connected vehicles are capable of V2V communication, but the non-connected vehicle is not capable of V2V communication. Two frames 510 and 520 of a time period are shown in Fig. 5, with frame 510 occurring prior to frame 520. Example 500 is described herein from the perspective of one of the connected vehicles, CV-1, which has a sensor detection range, as shown.
As indicated by frame 510, the pedestrians and obstacle are new detections by CV-1. Using the prioritization scheme of Fig. 4, CV-1 may prioritize the objects in example 500 in the following order (highest priority to lowest priority) : (1) the pedestrians crossing the street (e.g., because the pedestrians are newly detected moving VRUs, and no previous related information was received) , (2) the obstacle (e.g., because the obstacle is newly detected but not moving and not a VRU) , and (3) CV-2 (e.g., because CV-2 was previously detected, and messages were previously received from CV-2) .
As indicated by frame 520, now CV-3 and NCV-1 are new detections by CV-1 (e.g., because CV-3 and NCV-1 came within the sensor detection range of CV-1) . Again, using the prioritization scheme from example 400, CV-1 may prioritize the objects in example 500 in the following order (highest priority to lowest priority) : (1) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and no information associated with NCV-1 has been previously received) , (2) the pedestrians (e.g., because the pedestrians are previously detected VRUs and no previous information was received) , (3) the obstacle (e.g., because the obstacle was previously detected, is hazardous, and because no previous information was received) , (4) CV-3 (e.g., because newly detected, messages were previously received from CV-3) , (5) CV-2 (e.g., because CV-2 was previously detected and messages were previously received from CV-2) .
As indicated above, Fig. 5 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 5.
Fig. 6 is a diagram conceptually illustrating an example associated with sensor data sharing between vehicles in accordance with various aspects of the present disclosure. Example 600 includes three connected vehicles ( “CV-1, ” “CV-2, ” “CV-3” ) (referred to herein as “the connected vehicles” ) , two non-connected vehicles ( “NCV-1” and “NCV-2” ) , and an RSU. Similar to example 500, the connected vehicles are capable of V2V communication, but the non-connected vehicles are not capable of V2V communication. Two frames 610 and 620 of a time period are shown in Fig. 6, with frame 610 occurring prior to frame 620. Example 600 is described herein from the perspective of one of the connected vehicles, CV-1, which has a sensor detection range, as shown.
The RSU may aggregate information from the connected vehicles (e.g., for perception extension and/or traffic flow optimization) . Using the RSU, the initialization priorities of the priority scheme in example 400 can be changed in connection with example 600. For example, in the following examples, priority 1 and priority 2 of example 400 may be swapped.
As indicated by frame 610, NCV-2 is a new detection by CV-1. Using the modified prioritization scheme described above, CV-1 may prioritize the objects in example 600 in the following order (highest priority to lowest priority) : (1) NCV-2 (e.g., because NCV-2 is a newly detected vehicle and no previous information was received) , (2) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and previous information was received from the RSU) , (3) the pedestrians (e.g., because they are previously detected VRUs) , and (4) CV-2 (e.g., because CV-2 is a previously detected vehicle from which messages were previously received) .
As indicated by frame 620, NCV-1 and CV-3 are new detections by CV-1. Using the modified prioritization scheme described above, CV-1 may prioritize the objects in example 600 in the following order (highest priority to lowest priority) : (1) NCV-1 (e.g., because NCV-1 is a newly detected vehicle and no previous information was received) , (2) CV-3 (e.g., because CV-3 is a newly detected vehicle from which messages were previously received) , (3) NCV-2 (e.g., because NCV-2 is a previously detected vehicle and no previous information was received) , (4) the pedestrians (e.g., because they are previously detected VRUs) , and (5) CV-1/CV-2 (previously detected vehicles from which messages were previously received) .
As indicated above, Fig. 6 is provided merely as one or more examples. Other examples may differ from what is described with regard to Fig. 6.
Fig. 7 is a diagram illustrating an example process 700 performed, for example, by a sensor data sharing system, in accordance with various aspects of the present disclosure. Example process 700 is an example where the sensor data sharing  system (e.g., a sensor data sharing system of roadside platform 110, a sensor data sharing system of ECU 132, and/or the like) performs operations associated with a configuration for prioritizing vehicle to everything communications.
As shown in Fig. 7, in some aspects, process 700 may include detecting an object within an environment of a vehicle (block 710) . For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280 and/or the like) may detect an object within an environment of a vehicle, as described above.
As further shown in Fig. 7, in some aspects, process 700 may include determining, based on detecting the object, a set of prioritization indicators of the object (block 720) . For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280 and/or the like) may determine, based on detecting the object, a set of prioritization indicators of the object, as described above.
As further shown in Fig. 7, in some aspects, process 700 may include determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object (block 730) . For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280 and/or the like) may determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object, as described above.
As further shown in Fig. 7, in some aspects, process 700 may include transmitting a communication associated with the object, according to the priority score, to share information associated with the object (block 740) . For example, the sensor data sharing system (e.g., using computing resource 115, ECU 132, processor 220, memory 230, storage component 240, input component 250, output component 260, communication interface 270, sensor 280 and/or the like) may transmit a communication associated with the object, according to the priority score, to share information associated with the object, as described above.
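Blocks 710 through 740 of process 700 can be sketched end to end as a small pipeline. The function name and the callable parameters are placeholder assumptions standing in for the sensor, scoring, and communication components described above.

```python
from typing import Any, Callable

def process_700(detect: Callable[[], Any],
                get_indicators: Callable[[Any], dict],
                score: Callable[[dict], float],
                transmit: Callable[[Any, float], None]) -> float:
    """Illustrative end-to-end sketch of example process 700."""
    obj = detect()                     # block 710: detect an object
    indicators = get_indicators(obj)   # block 720: prioritization indicators
    priority = score(indicators)       # block 730: priority score
    transmit(obj, priority)            # block 740: share per the score
    return priority
```

For instance, wiring in trivial callables (a detector returning a detected obstacle, a scorer that assigns the highest priority to new objects, and a transmitter that records the outgoing communication) exercises the four blocks in order.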
Process 700 may include additional aspects, such as any single aspect or any combination of aspects described below and/or in connection with one or more other processes described elsewhere herein.
In a first aspect, the set of prioritization indicators comprises at least one of: historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle, historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment, characteristic information that identifies a type of the object, mobility information that identifies a mobility status of the object or the vehicle, or shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
In a second aspect, alone or in combination with the first aspect, the set of prioritization indicators includes historical object detection information, the priority score corresponds to a higher priority when the historical object detection information identifies that the object has not been previously identified by a sensor system of the vehicle, and the priority score corresponds to a lower priority when the historical object detection information identifies that the object has been previously identified by the sensor system of the vehicle.
In a third aspect, alone or in combination with one or more of the first and second aspects, the set of prioritization indicators includes historical object communication information, the priority score corresponds to a higher priority when the historical object communication information identifies that no past communication associated with the object has been received from a communication device within the environment, and the priority score corresponds to a lower priority when the historical object communication information identifies that a past communication associated with the object has been received from a communication device within the environment.
In a fourth aspect, alone or in combination with one or more of the first through third aspects, the set of prioritization indicators includes characteristic information associated with the object, the priority score corresponds to a higher priority when the characteristic information identifies that the object is a VRU, and the priority score corresponds to a lower priority when the characteristic information identifies that the object is not a VRU.
In a fifth aspect, alone or in combination with one or more of the first through fourth aspects, the set of prioritization indicators includes characteristic information associated with the object, the priority score corresponds to a higher priority when the characteristic information identifies that the object is likely to be a hazard, and the priority score corresponds to a lower priority when the characteristic information identifies that the object is not likely to be a hazard.
In a sixth aspect, alone or in combination with one or more of the first through fifth aspects, the set of prioritization indicators includes mobility information associated with the object, the priority score corresponds to a higher priority when the mobility information identifies that the object is moving, and the priority score corresponds to a lower priority when the mobility information identifies that the object is not moving.
In a seventh aspect, alone or in combination with one or more of the first through sixth aspects, the set of prioritization indicators includes previously shared object information, the priority score corresponds to a higher priority when the previously shared object information identifies that a past communication that shared previous information associated with the object has not been transmitted within a threshold time period, and the priority score corresponds to a lower priority when the previously shared object information identifies that the past communication that shared previous information associated with the object has been transmitted within the threshold time period.
In an eighth aspect, alone or in combination with one or more of the first through seventh aspects, the priority score is determined based on a priority scoring system that applies corresponding weights to individual priority indicators of the set of priority indicators. In a ninth aspect, alone or in combination with one or more of the first through eighth aspects, the priority score is associated with a frequency of transmitting the communication associated with the object.
In a tenth aspect, alone or in combination with one or more of the first through ninth aspects, the priority score is relative to priorities for transmitting other communications associated with other objects within the environment. In an eleventh aspect, alone or in combination with one or more of the first through tenth aspects, the communication is a vehicle-to-everything communication that is transmitted from the vehicle.
Although Fig. 7 shows example blocks of process 700, in some aspects, process 700 may include additional blocks, fewer blocks, different blocks, or differently arranged blocks than those depicted in Fig. 7. Additionally, or alternatively, two or more of the blocks of process 700 may be performed in parallel.
The foregoing disclosure provides illustration and description but is not intended to be exhaustive or to limit the aspects to the precise form disclosed. Modifications and variations may be made in light of the above disclosure or may be acquired from practice of the aspects.
As used herein, the term “component” is intended to be broadly construed as hardware, firmware, and/or a combination of hardware and software. As used herein, a processor is implemented in hardware, firmware, and/or a combination of hardware and software.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, and/or the like.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various aspects. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various aspects includes each dependent claim in combination with every other claim in the claim set. A phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c or any other ordering of a, b, and c) .
No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items and may be used interchangeably with “one or more. ” Further, as used herein, the article “the” is intended to include one or more items referenced in connection with the article “the” and may be used interchangeably with “the one or more. ” Furthermore, as used herein, the terms “set” and “group” are intended to include one or more items (e.g., related items, unrelated  items, a combination of related and unrelated items, and/or the like) , and may be used interchangeably with “one or more. ” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used herein, the terms “has, ” “have, ” “having, ” and/or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least in part on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or, ” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of” ) .

Claims (30)

  1. A method, comprising:
    detecting, by a device, an object within an environment of a vehicle;
    determining, by the device and based on detecting the object, a set of prioritization indicators of the object;
    determining, by the device and based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and
    transmitting, by the device, a communication associated with the object, according to the priority score, to share information associated with the object.
  2. The method of claim 1, wherein the set of prioritization indicators comprises at least one of:
    historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle,
    historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment,
    characteristic information that identifies a type of the object,
    mobility information that identifies a mobility status of the object or the vehicle, or
    shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
  3. The method of claim 1, wherein the set of prioritization indicators includes historical object detection information,
    wherein the priority score corresponds to a higher priority when the historical object detection information identifies that the object has not been previously identified by a sensor system of the vehicle, and
    wherein the priority score corresponds to a lower priority when the historical object detection information identifies that the object has been previously identified by the sensor system of the vehicle.
  4. The method of claim 1, wherein the set of prioritization indicators includes historical object communication information,
    wherein the priority score corresponds to a higher priority when the historical object communication information identifies that no past communication associated with the object has been received from a communication device within the environment, and
    wherein the priority score corresponds to a lower priority when the historical object communication information identifies that a past communication associated with the object has been received from a communication device within the environment.
  5. The method of claim 1, wherein the set of prioritization indicators includes characteristic information associated with the object, and
    wherein the priority score corresponds to a higher priority when the characteristic information identifies that the object is a vulnerable road user (VRU) , and
    wherein the priority score corresponds to a lower priority when the characteristic information identifies that the object is not a VRU.
  6. The method of claim 1, wherein the set of prioritization indicators includes characteristic information associated with the object, and
    wherein the priority score corresponds to a higher priority when the characteristic information identifies that the object is likely to be a hazard, and
    wherein the priority score corresponds to a lower priority when the characteristic information identifies that the object is not likely to be a hazard.
  7. The method of claim 1, wherein the set of prioritization indicators includes mobility information associated with the object, and
    wherein the priority score corresponds to a higher priority when the mobility information identifies that the object is moving, and
    wherein the priority score corresponds to a lower priority when the mobility information identifies that the object is not moving.
  8. The method of claim 1, wherein the set of prioritization indicators includes previously shared object information,
    wherein the priority score corresponds to a higher priority when the previously shared object information identifies that a past communication that shared previous information associated with the object has not been transmitted within a threshold time period, and
    wherein the priority score corresponds to a lower priority when the previously shared object information identifies that the past communication that shared previous information associated with the object has been transmitted within the threshold time period.
  9. The method of claim 1, wherein the priority score is determined based on a priority scoring system that applies corresponding weights to individual priority indicators of the set of priority indicators.
  10. The method of claim 1, wherein the priority score is associated with a frequency of transmitting the communication associated with the object.
  11. The method of claim 1, wherein the priority score is relative to priorities for transmitting other communications associated with other objects within the environment.
  12. The method of claim 1, wherein the communication is a vehicle-to-everything (V2X) communication that is transmitted from the vehicle.
  13. A device for vehicle to everything (V2X) communication, comprising:
    a memory; and
    one or more processors operatively coupled to the memory, the memory and the one or more processors configured to:
    detect an object within an environment of a vehicle;
    determine, based on detecting the object, a set of prioritization indicators of the object;
    determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and
    transmit a communication associated with the object, according to the priority score, to share information associated with the object.
  14. The device of claim 13, wherein the set of prioritization indicators comprises at least one of:
    historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle,
    historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment,
    characteristic information that identifies a type of the object,
    mobility information that identifies a mobility status of the object or the vehicle, or
    shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
  15. The device of claim 13, wherein the set of prioritization indicators includes historical object detection information,
    wherein the priority score corresponds to a higher priority when the historical object detection information identifies that the object has not been previously identified by a sensor system of the vehicle, and
    wherein the priority score corresponds to a lower priority when the historical object detection information identifies that the object has been previously identified by the sensor system of the vehicle.
  16. The device of claim 13, wherein the set of prioritization indicators includes historical object communication information,
    wherein the priority score corresponds to a higher priority when the historical object communication information identifies that no past communication associated with the object has been received from a communication device within the environment, and
    wherein the priority score corresponds to a lower priority when the historical object communication information identifies that a past communication associated with the object has been received from a communication device within the environment.
  17. The device of claim 13, wherein the set of prioritization indicators includes characteristic information associated with the object, and
    wherein the priority score corresponds to a higher priority when the characteristic information identifies that the object is a vulnerable road user (VRU) , and
    wherein the priority score corresponds to a lower priority when the characteristic information identifies that the object is not a VRU.
  18. The device of claim 13, wherein the set of prioritization indicators includes characteristic information associated with the object, and
    wherein the priority score corresponds to a higher priority when the characteristic information identifies that the object is likely to be a hazard, and
    wherein the priority score corresponds to a lower priority when the characteristic information identifies that the object is not likely to be a hazard.
  19. The device of claim 13, wherein the set of prioritization indicators includes mobility information associated with the object, and
    wherein the priority score corresponds to a higher priority when the mobility information identifies that the object is moving, and
    wherein the priority score corresponds to a lower priority when the mobility information identifies that the object is not moving.
  20. The device of claim 13, wherein the set of prioritization indicators includes previously shared object information,
    wherein the priority score corresponds to a higher priority when the previously shared object information identifies that a past communication that shared previous information associated with the object has not been transmitted within a threshold time period, and
    wherein the priority score corresponds to a lower priority when the previously shared object information identifies that the past communication that shared previous information associated with the object has been transmitted within the threshold time period.
  21. The device of claim 13, wherein the priority score is determined based on a priority scoring system that applies corresponding weights to individual priority indicators of the set of priority indicators.
  22. The device of claim 13, wherein the priority score is associated with a frequency of transmitting the communication associated with the object.
  23. The device of claim 13, wherein the priority score is relative to priorities for transmitting other communications associated with other objects within the environment.
  24. The device of claim 13, wherein the communication is a vehicle-to-everything (V2X) communication that is transmitted from the vehicle.
  25. A non-transitory computer-readable medium storing one or more instructions, the one or more instructions comprising:
    one or more instructions that, when executed by one or more processors of a device, cause the one or more processors to:
    detect an object within an environment of a vehicle;
    determine, based on detecting the object, a set of prioritization indicators of the object;
    determine, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and
    transmit a communication associated with the object, according to the priority score, to share information associated with the object.
  26. The non-transitory computer-readable medium of claim 25, wherein the set of prioritization indicators comprises at least one of:
    historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle,
    historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment,
    characteristic information that identifies a type of the object,
    mobility information that identifies a mobility status of the object or the vehicle, or
    shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
  27. The non-transitory computer-readable medium of claim 25, wherein the priority score is determined based on a priority scoring system that applies corresponding weights to individual prioritization indicators of the set of prioritization indicators.
  28. An apparatus for wireless communication, comprising:
    means for detecting an object within an environment of a vehicle;
    means for determining, based on detecting the object, a set of prioritization indicators of the object;
    means for determining, based on the set of prioritization indicators, a priority score associated with sharing information associated with the object; and
    means for transmitting a communication associated with the object, according to the priority score, to share information associated with the object.
  29. The apparatus of claim 28, wherein the set of prioritization indicators comprises at least one of:
    historical object detection information that identifies whether the object has been previously identified within the environment by a sensor of the vehicle,
    historical object communication information that identifies whether a communication associated with the object has been received from a communication device within the environment,
    characteristic information that identifies a type of the object,
    mobility information that identifies a mobility status of the object or the vehicle, or
    shared object information that identifies a timestamp of a past communication that shared previous information associated with the object.
  30. The apparatus of claim 28, wherein the priority score is determined based on a priority scoring system that applies corresponding weights to individual prioritization indicators of the set of prioritization indicators.
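The weighted scoring arrangement recited in claims 21, 27, and 30, together with the score-to-frequency relationship in claim 22, can be sketched briefly as follows. This is an illustrative sketch only: the indicator names, weight values, staleness threshold, and transmission-interval bounds below are assumptions for exposition and are not specified by the claims.

```python
# Hypothetical sketch of a priority scoring system that applies
# corresponding weights to individual prioritization indicators
# (cf. claims 21, 27, 30), and of mapping the resulting score to a
# transmission frequency (cf. claim 22). All constants are illustrative.

from dataclasses import dataclass


@dataclass
class PrioritizationIndicators:
    previously_detected: bool        # historical object detection information
    previously_received: bool        # historical object communication information
    object_type_weight: float        # characteristic information (e.g., pedestrian high, debris low)
    is_moving: bool                  # mobility information
    seconds_since_last_share: float  # shared object information (time since last shared)


# Illustrative weights applied to the individual indicators; they sum to 1.0
# so the resulting score falls in [0, 1].
WEIGHTS = {
    "novel_detection": 0.3,
    "not_already_shared_by_others": 0.25,
    "object_type": 0.2,
    "mobility": 0.15,
    "staleness": 0.1,
}


def priority_score(ind: PrioritizationIndicators, threshold_s: float = 1.0) -> float:
    """Return a score in [0, 1]; higher means share sooner / more often."""
    # Normalize time-since-last-share against an assumed threshold period.
    staleness = min(ind.seconds_since_last_share / threshold_s, 1.0)
    components = {
        "novel_detection": 0.0 if ind.previously_detected else 1.0,
        "not_already_shared_by_others": 0.0 if ind.previously_received else 1.0,
        "object_type": ind.object_type_weight,
        "mobility": 1.0 if ind.is_moving else 0.0,
        "staleness": staleness,
    }
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)


MIN_INTERVAL_S = 0.1  # assumed: highest-priority objects re-shared at up to 10 Hz
MAX_INTERVAL_S = 5.0  # assumed: lowest-priority objects re-shared every 5 s


def transmit_interval(score: float) -> float:
    """Map a priority score (clamped to [0, 1]) to an inter-transmission
    interval: higher scores yield shorter intervals, i.e., more frequent
    sharing of the object information."""
    s = max(0.0, min(score, 1.0))
    return MAX_INTERVAL_S - s * (MAX_INTERVAL_S - MIN_INTERVAL_S)
```

Under this sketch, a newly detected, moving, high-importance object that no nearby device has already reported receives a score near 1.0 and is re-shared most frequently, while an object that was recently shared by another communication device scores low and consumes little channel capacity.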
PCT/CN2020/074498 2020-02-07 2020-02-07 Configuration for prioritizing vehicle to everything communications WO2021155568A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/074498 WO2021155568A1 (en) 2020-02-07 2020-02-07 Configuration for prioritizing vehicle to everything communications


Publications (1)

Publication Number Publication Date
WO2021155568A1 true WO2021155568A1 (en) 2021-08-12

Family

ID=77199692



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105225525A (en) * 2015-09-23 2016-01-06 宇龙计算机通信科技(深圳)有限公司 Information processing method, signal conditioning package and server
CA3028647A1 (en) * 2018-12-18 2019-03-18 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for processing traffic objects
WO2019212574A1 (en) * 2018-05-04 2019-11-07 Harman International Industries, Incorporated System and method for contextualizing objects in a vehicle horizon
US20190386913A1 (en) * 2018-06-13 2019-12-19 Futurewei Technologies, Inc. Multipath Selection System and Method for Datacenter-Centric Metro Networks
CN110728721A (en) * 2019-10-21 2020-01-24 北京百度网讯科技有限公司 Method, device and equipment for acquiring external parameters


Similar Documents

Publication Publication Date Title
JP6037468B2 (en) Method for notifying that moving body is approaching specific area, and server computer and server computer program therefor
US10661795B1 (en) Collision detection platform
US9815475B2 (en) Analytics platform for identifying a roadway anomaly
US11398150B2 (en) Navigation analysis for a multi-lane roadway
US20210063546A1 (en) Distributed sensor calibration and sensor sharing using cellular vehicle-to-everything (cv2x) communication
US20180326907A1 (en) Vehicle collision avoidance
US20230102802A1 (en) Map change detection
US11727810B2 (en) Systems and methods for avoiding intersection collisions
WO2021155570A1 (en) Vehicle to vehicle communication control for vehicles in platoon
US10210751B1 (en) Identification of traffic control mechanisms using machine learning
US11580856B2 (en) Identification of a poorly parked vehicle and performance of a first group of actions to cause one or more other devices to perform a second group of actions
US11498580B2 (en) Method and device for facilitating manual operation of a vehicle
WO2021155568A1 (en) Configuration for prioritizing vehicle to everything communications
JPWO2019171437A1 (en) In-vehicle device, information processing method, and information processing program
US10953877B2 (en) Road condition prediction
US11673577B2 (en) System and methods of adaptive relevancy prediction for autonomous driving
WO2020199183A1 (en) Sensor data sharing between vehicles
CN108944921A (en) A kind of longitudinally controlled method and apparatus for vehicle
US11060875B2 (en) Automated identification of problematic locations for navigation and route guidance modification for dynamic alerting
JP6874456B2 (en) Anti-collision devices, communication systems, anti-collision methods, and computer programs
WO2022099603A1 (en) Radar interference mitigation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20917660; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 20917660; Country of ref document: EP; Kind code of ref document: A1