EP3585029B1 - Providing inter-vehicle data communications for multimedia content - Google Patents

Providing inter-vehicle data communications for multimedia content

Info

Publication number
EP3585029B1
Authority
EP
European Patent Office
Prior art keywords
vehicle
multimedia content
request message
content request
location
Prior art date
Legal status
Active
Application number
EP19167176.7A
Other languages
German (de)
English (en)
Other versions
EP3585029A1 (fr)
Inventor
Edward Snow Willis
Kristian Neil Spriggs
Sameh Ayoub
Current Assignee
BlackBerry Ltd
Original Assignee
BlackBerry Ltd
Priority date
Filing date
Publication date
Application filed by BlackBerry Ltd
Publication of EP3585029A1
Application granted
Publication of EP3585029B1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/46 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for vehicle-to-vehicle communication [V2V]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G 1/096791 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60 Network streaming of media packets
    • H04L 65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L 65/612 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/75 Indicating network or usage conditions on the user display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/12 Messaging; Mailboxes; Announcements
    • H04W 4/14 Short messaging services, e.g. short message services [SMS] or unstructured supplementary service data [USSD]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • the present disclosure relates to providing inter-vehicle data communications for multimedia content.
  • a vehicle can include one or more sensors.
  • the one or more sensors can generate inputs, e.g., video or audio inputs, that reflect the surroundings of the vehicle.
  • Examples of the sensors can include cameras, microphones, laser, radar, ultrasonic, light detection and ranging (LIDAR) or any other sensors.
  • the vehicle may also include an autopilot processing platform that generates autopilot commands.
  • the autopilot processing platform can receive inputs from one or more sensors installed on the vehicle.
  • the autopilot processing platform may include one or more autopilot processors that generate autopilot commands based on these inputs.
  • These autopilot commands are directed to components of the vehicle to control the movements of the vehicle. Examples of the components include without limitation steering wheel, brakes, accelerator, lights, and the like. Examples of the autopilot commands include without limitation accelerate, decelerate, turn left or right, signal, and the like.
  • a vehicle equipped with the autopilot processing platform can be referred to as a self-driving vehicle, a driver-less vehicle, an autonomous or semi-autonomous vehicle, or an autopilot vehicle.
  • The state of the art is represented by KR 2017 0081920 A .
  • vehicles can transmit data between one another.
  • one vehicle can transmit information about the traffic around the vehicle to another vehicle.
  • This information can include environmental information such as streetlights, buildings, obstacles, cyclists, or pedestrians that are captured by the sensors of the vehicle.
  • This information can also include information related to the driving actions of the vehicle, e.g., information on speed, acceleration, and turning.
  • This data can be used by the receiving vehicle to perform autopilot processing and generate autopilot or semi-autopilot commands for the receiving vehicle.
  • Inter-vehicle data communications can be performed using vehicle-to-vehicle (V2V) communication protocols.
  • V2V refers to communication between any two Intelligent Transportation Service (ITS) capable vehicles.
  • V2V enables transport entities, such as vehicles, to obtain and share information regarding their local environment in order to process and share knowledge for more intelligent transport related services or applications, for example, cooperative route planning, cooperative collision warning, or autonomous driving.
  • V2V can refer to services provided to a user equipment for communication supporting vehicle oriented services. Examples of vehicle oriented services include road safety, (transport) traffic efficiency, and other application services.
  • the technology is also applicable to other types of transportation systems, their infrastructure and passengers, e.g., trains, track side signaling, passengers, aerial vehicles, drones, etc., and vehicles that can communicate with trackside signaling, e.g., cars at level crossings etc.
  • the inter-vehicle data communications can also be carried out using vehicle-to-infrastructure (V2I) communication protocols.
  • multimedia content, including graphic images such as a picture or video of the road, can provide a more direct view of traffic conditions.
  • the images of a congestion point can provide a good indication of the degree of congestion at that location.
  • cameras have been installed at fixed locations, such as highway exits or some busy intersections, and a vehicle or a driver can query relevant websites to receive images from these cameras.
  • V2V communication protocols can be used to facilitate the transmission of this graphic information between vehicles.
  • V2V messages have limited size, and therefore may not be able to carry image data that has a large payload. Therefore, V2V messages can be used to convey multimedia content requests, while a different communication technology, e.g., Multimedia Messaging Service (MMS), can be used to transport the requested multimedia content.
  • V2V communication protocols use short-range communication technologies to transmit information between vehicles that are close to each other, while the distance between the location of the vehicle that requests the multimedia content and the congestion location may be beyond the coverage range of a V2V message.
  • the multimedia content request can include location information that enables vehicles to relay the multimedia content request to different vehicles.
  • FIGS. 1-3 and associated descriptions provide additional details to these implementations.
  • FIG. 1 is a schematic diagram showing an example communication system 100 that provides inter-vehicle data communications for multimedia content, according to an implementation.
  • the example communication system 100 includes a first vehicle 120, a second vehicle 122, and a third vehicle 124 that are communicatively coupled with each other.
  • the example communication system 100 also includes a server 130 that is communicatively coupled with the first vehicle 120, the second vehicle 122, and the third vehicle 124.
  • a vehicle e.g., the first vehicle 120, the second vehicle 122, and the third vehicle 124, can include a motor vehicle (e.g., automobile, car, truck, bus, motorcycle, etc.), aircraft (e.g., airplane, unmanned aerial vehicle, unmanned aircraft system, drone, helicopter, etc.), spacecraft (e.g., spaceplane, space shuttle, space capsule, space station, satellite, etc.), watercraft (e.g., ship, boat, hovercraft, submarine, etc.), railed vehicle (e.g., train, tram, etc.), and other types of vehicles including any combinations of any of the foregoing, whether currently existing or after arising.
  • the first vehicle 120 includes a camera 102, a vehicle component controller 104, a vehicular system processor 106, a communication subsystem 116, a user interface 118, memory 114, a navigation system 112, and a location sensor 108, which are connected to a bus 110.
  • the second vehicle 122 and the third vehicle 124 can include similar components as the first vehicle 120.
  • the first vehicle 120 includes a camera 102. Although illustrated as a single camera 102 in FIG. 1 , the first vehicle 120 can include two or more cameras 102.
  • the camera 102 can include a lens, image processors, or other components that generate still images such as photos, or videos.
  • the first vehicle 120 includes a location sensor 108.
  • the location sensor 108 represents an application, a set of applications, software, software modules, hardware, or any combination thereof that can be configured to determine a current location of the first vehicle 120.
  • the location sensor 108 can be a Global Positioning System (GPS) receiver.
  • the first vehicle 120 can include other sensors that detect or measure information for the first vehicle 120.
  • these sensors can include devices that capture environmental information that is external to the first vehicle 120, such as microphones, radars, laser transmitters and receivers, and the like. These sensors can provide environmental inputs for an autopilot processing platform operating on the first vehicle 120 to make autopilot decisions.
  • These sensors can also include devices that capture information that is internal to the first vehicle 120, such as monitors for components such as engine, battery, fuel, electronic system, cooling systems and the like. These sensors can provide operation status and warnings to the autopilot processing platform operating on the first vehicle 120.
  • the first vehicle 120 includes a vehicle component controller 104. Although illustrated as a vehicle component controller 104 in FIG. 1 , the first vehicle 120 can include two or more vehicle component controllers 104.
  • the vehicle component controller 104 represents a controller that controls the operation of a component on the first vehicle 120. Examples of the components can include engine, accelerator, brake, radiator, battery, steering wheel, transmission system, cooling system, electrical system, and any other components of the first vehicle 120.
  • the vehicle component controller 104 can operate a respective component automatically, according to input from the vehicular system processor 106, or a combination thereof.
  • the vehicle component controller 104 can include a data processing apparatus.
  • the navigation system 112 represents an application, a set of applications, software, software modules, hardware, or any combination thereof that can be configured to provide navigation information to the first vehicle 120.
  • the navigation system 112 can include a map application that generates a map, processes current traffic information, and calculates a route for the first vehicle 120.
  • the vehicular system processor 106 can include one or more processing components (alternatively referred to as “processors” or “central processing units” (CPUs)) configured to execute instructions related to one or more of the processes, steps, or actions for the autopilot processing platform operating on the first vehicle 120. Generally, the vehicular system processor 106 executes instructions and manipulates data to perform the operations of the autopilot processing platform. The vehicular system processor 106 can receive inputs from the sensors (including the camera 102 and the location sensor 108) and generate commands to the vehicle component controller 104. In some cases, the vehicular system processor 106 can perform autopilot operations.
  • the vehicular system processor 106 can perform operations including generating multimedia content request messages, directing the camera 102 to generate multimedia content, and directing the communication subsystem 116 to transmit and receive multimedia content request messages.
  • FIGS. 2-3 and associated descriptions provide additional details to these implementations.
  • the vehicular system processor 106 can include a data processing apparatus.
  • the communication subsystem 116 can be configured to provide wireless or wireline communication for data or control information provided by the vehicular system processor 106.
  • the communication subsystem 116 can support transmissions over wireless local area network (WLAN or WiFi), near field communication (NFC), infrared (IR), Radio-frequency identification (RFID), Bluetooth (BT), Universal Serial Bus (USB), or any other short-range communication protocols.
  • the communication subsystem 116 can also support Global System for Mobile communication (GSM), Interim Standard 95 (IS-95), Universal Mobile Telecommunications System (UMTS), CDMA2000 (Code Division Multiple Access), Evolved Universal Mobile Telecommunications System (E-UMTS), Long Term Evolution (LTE), LTE-Advanced, 5G, or any other radio access technologies.
  • the communication subsystem 116 can include, for example, one or more antennas, a receiver, a transmitter, a local oscillator, a mixer, and a digital signal processing (DSP) unit.
  • the communication subsystem 116 can support multiple input multiple output (MIMO) transmissions.
  • the receivers in the communication subsystem 116 can be an advanced receiver or a baseline receiver.
  • the communication subsystem 116 can support inter-vehicle communication protocols, for example V2V communication protocols, to communicate with other vehicles, e.g., the second vehicle 122.
  • the user interface 118 can include, for example, any of the following: one or more of a display or touch screen display (for example, a liquid crystal display (LCD), a light emitting diode (LED), an organic light emitting diode (OLED), or a micro-electromechanical system (MEMS) display), a keyboard or keypad, a trackball, a speaker, or a microphone.
  • the user interface 118 can also include an I/O interface, for example, a universal serial bus (USB) interface.
  • the memory 114 can be a computer-readable storage medium. Examples of the memory 114 include volatile and non-volatile memory, magnetic media, optical media, random access memory (RAM), read-only memory (ROM), removable media, and others.
  • the memory 114 can store an operating system (OS) of the first vehicle 120 and various other computer-executable software programs for performing one or more of the processes, steps, or actions described above.
  • the bus 110 provides a communication interface for components of the autopilot processing platform operating on the first vehicle 120.
  • the bus 110 can be implemented using a Controller Area Network (CAN) bus.
  • the navigation system 112 can be implemented on a portable electronic device that is connected with the first vehicle 120 over NFC, BT, USB or any other wireless or wireline communication technologies.
  • the portable electronic device may include, without limitation, any of the following: endpoint, computing device, mobile device, mobile electronic device, user device, mobile station, subscriber station, portable electronic device, mobile communications device, wireless modem, wireless terminal, or other electronic device.
  • Examples of an endpoint may include a mobile device, IoT (Internet of Things) device, EoT (Enterprise of Things) device, cellular phone, personal data assistant (PDA), smart phone, laptop, tablet, personal computer (PC), pager, portable computer, portable gaming device, wearable electronic device, health/medical/fitness device, camera, or other mobile communications devices having components for communicating voice or data via a wireless communication network.
  • the server 130 represents an application, a set of applications, software, software modules, hardware, or any combination thereof that can be configured to transmit multimedia content between the first vehicle 120, the second vehicle 122, and the third vehicle 124.
  • the server 130 can be an MMS server that receives and transmits multimedia content according to MMS protocols.
  • the server 130 can also be a server that provides social media, email, or other communication services.
  • the server 130 can be implemented in a cloud computing platform.
  • the example communication system 100 includes the network 140.
  • the network 140 represents an application, set of applications, software, software modules, hardware, or combination thereof, that can be configured to transmit data between the server 130 and the vehicles in the system 100.
  • the network 140 includes a wireless network, a wireline network, or a combination thereof.
  • the network 140 can include one or a plurality of radio access networks (RANs), core networks (CNs), and external networks.
  • the RANs may comprise one or more radio access technologies.
  • the radio access technologies may be Global System for Mobile communication (GSM), Interim Standard 95 (IS-95), Universal Mobile Telecommunications System (UMTS), CDMA2000 (Code Division Multiple Access), Evolved Universal Mobile Telecommunications System (E-UMTS), Long Term Evolution (LTE), LTE-Advanced, 5G, or any other radio access technologies.
  • the core networks may be evolved packet cores (EPCs).
  • a RAN is part of a wireless telecommunication system which implements a radio access technology, such as UMTS, CDMA2000, 3GPP LTE, 3GPP LTE-A, and 5G.
  • a RAN includes at least one base station.
  • a base station may be a radio base station that may control all or at least some radio-related functions in a fixed part of the system.
  • the base station may provide a radio interface within its coverage area, or cell, for a mobile device to communicate.
  • base stations may be distributed throughout the cellular network to provide a wide area of coverage.
  • the base station communicates directly with one or a plurality of mobile devices, other base stations, and one or more core network nodes.
  • While elements of FIG. 1 are shown as including various component parts, portions, or modules that implement the various features and functionality, nevertheless, these elements may, instead, include a number of sub-modules, third-party services, components, libraries, and such, as appropriate. Furthermore, the features and functionality of various components can be combined into fewer components, as appropriate.
  • FIG. 2 is a flow diagram showing an example process 200 that provides inter-vehicle data communications for multimedia content, according to an implementation.
  • the process 200 can be implemented by the first vehicle 120, the second vehicle 122, and the third vehicle 124 as shown in FIG. 1 .
  • the process 200 shown in FIG. 2 can also be implemented using additional, fewer, or different entities.
  • the process 200 shown in FIG. 2 can also be implemented using additional, fewer, or different operations, which can be performed in the order shown or in a different order. In some instances, an operation or a group of the operations can be iterated or repeated, for example, for a specified number of iterations or until a terminating condition is reached.
  • the example process 200 begins at 210, wherein the first vehicle 120 transmits a multimedia content request message to the second vehicle 122.
  • the multimedia content request message includes a target location field.
  • the target location field indicates the target location where the multimedia content is requested.
  • the target location field can include longitude and latitude data of the target location, road names of intersection of the target location, highway exit number of the target location, or any combinations thereof.
  • the target location can be generated based on the current traffic on the road.
  • the first vehicle 120 can receive traffic status information from a navigation system.
  • the traffic status information can include current average driving speed of different segments on the road.
  • the first vehicle 120 can compare the current average driving speed with historical average driving speed.
  • the first vehicle 120 can determine whether a road segment is congested and the degree of the congestion based on the comparison.
  • the congested locations can be determined based on the congested road segments.
  • the congested location can be the starting point on a road segment where the current average driving speed is slower than the historical average driving speed by at least a threshold.
  • the first vehicle 120 can select one or more locations from these congested locations.
  • the first vehicle 120 can select the first congested location on its route. Alternatively or in combination, the first vehicle 120 can select the congested locations that are more severe, i.e., where the differences between the current average driving speed and the historical average driving speed are the largest.
  • the traffic status information can include location information of the congested locations, the degree of congestion at those locations, or a combination thereof.
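  • As an illustration only (not part of the patent text), the congestion-based target selection described above can be sketched in Python. The RoadSegment fields, the navigation-system data format, and the 30% slowdown threshold are assumptions for this sketch; the disclosure leaves the exact criterion open (first congested location on the route, most severe congestion, etc.).

```python
# Illustrative sketch: pick a target location from traffic status information.
# Class/field names and the slowdown threshold are assumptions, not part of
# the patent text.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RoadSegment:
    start_lat: float
    start_lon: float
    current_avg_speed: float     # km/h, reported by the navigation system
    historical_avg_speed: float  # km/h, for the same segment and time of day

def select_target_location(segments: List[RoadSegment],
                           slowdown: float = 0.3) -> Optional[Tuple[float, float]]:
    """A segment counts as congested when its current average speed falls
    below the historical average by at least the `slowdown` fraction; the
    starting point of the most severe such segment becomes the target."""
    congested = [s for s in segments
                 if s.current_avg_speed < (1.0 - slowdown) * s.historical_avg_speed]
    if not congested:
        return None
    worst = max(congested,
                key=lambda s: s.historical_avg_speed - s.current_avg_speed)
    return (worst.start_lat, worst.start_lon)
```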
  • the target location can be determined based on user inputs.
  • the first vehicle 120 can output the current traffic condition on a user interface, and the user can select a target location by a touch, tap, text input, voice input, or any other user interface input techniques. The user can select the target location according to current traffic, tourist attractions, or any other points of interest.
  • the multimedia content request message also includes a current location field that indicates the current location of the first vehicle 120.
  • the first vehicle 120 can use a location determination sensor, e.g., a GPS receiver, to determine the current location, and include location data of the current location, e.g., longitude and latitude data, in the current location field.
  • the multimedia content request message can also include a contact information field.
  • the contact information field can include contact information associated with the first vehicle 120. Examples of the contact information can include a phone number, an email address, a social media handle, or other addressing information for the first vehicle 120 or a communication device associated with the first vehicle 120.
  • the multimedia content request message can comprise a V2V message, e.g., a broadcast, a multi-hop, or a packet forwarding V2V message.
  • the multimedia content request message can be transmitted between the first vehicle 120 and the second vehicle 122 using a V2V communication protocol.
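  • Putting the fields above together, a minimal sketch of the request message might look as follows. The structure and field names are assumptions for illustration; the patent does not prescribe a concrete V2V encoding. The optional type/size, threshold, and end-time fields anticipate features described later in this disclosure.

```python
# Illustrative sketch of the multimedia content request message fields.
# All names are assumptions; the disclosure does not fix a wire format.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MultimediaContentRequest:
    # Target location field: where the multimedia content is requested.
    target_lat: float
    target_lon: float
    # Current location field: where the requesting (first) vehicle is.
    requester_lat: float
    requester_lon: float
    # Contact information field: phone number for MMS, email address,
    # social media handle, or other addressing information.
    contact_info: str
    # Optional indications also described in this disclosure:
    content_type: str = "photo"                # e.g., "photo" or "video"
    content_size: int = 1                      # number of images or video length (s)
    match_threshold_m: Optional[float] = None  # match threshold, if carried
    end_time_epoch: Optional[float] = None     # discard the request after this
```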
  • the second vehicle 122 determines the current location of the second vehicle 122.
  • the current location of the second vehicle 122 can be determined using a location determination sensor.
  • the second vehicle 122 determines whether the current location of the second vehicle 122 matches the target location indicated by the multimedia content request message. In some implementations, the second vehicle 122 can compare the current location and the target location, and determine the distance between these two locations. If the distance is below a threshold, the second vehicle 122 can determine that its current location matches the target location.
  • the threshold can be configured at the second vehicle 122, e.g., by an owner of the second vehicle 122, the manufacturer of the second vehicle 122, a server, or any combinations thereof. Alternatively or in combination, the threshold can be included in the multimedia content request message.
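  • A minimal sketch of this matching test, assuming longitude/latitude coordinates and a great-circle (haversine) distance, is shown below; the 200 m default threshold is an illustrative assumption.

```python
# Illustrative sketch of the step-220 location match. The haversine formula
# and the 200 m default threshold are assumptions for this sketch.

import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def location_matches(current: tuple, target: tuple,
                     threshold_m: float = 200.0) -> bool:
    """The current location matches the target location when the distance
    between the two is below a threshold, which may be locally configured
    or carried in the request message."""
    return haversine_m(*current, *target) < threshold_m
```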
  • the process 200 proceeds from 220 to 230, where the second vehicle 122 generates the multimedia content.
  • the multimedia content can include a video, a photo, audio, text, or any other types of graphic image data.
  • the type and the size (e.g., the number of images or the length of the video) of the multimedia content can be indicated by the multimedia content request message.
  • the second vehicle 122 can use one or more sensors, e.g., cameras mounted at different parts of the second vehicle, to generate the multimedia content.
  • the process 200 proceeds to 235, where the second vehicle 122 transmits the multimedia content to the first vehicle 120.
  • the multimedia content can be transmitted using a communication protocol that can handle a large size of content.
  • the multimedia content can be transmitted using MMS.
  • the second vehicle 122 can use the phone number of the first vehicle 120 that is included in the multimedia content request message to transmit the multimedia content using MMS.
  • the second vehicle 122 can also use other communication protocols, e.g., email or social media app to transmit the multimedia content, using the corresponding contact information of the first vehicle 120 in the multimedia content request message.
  • the first vehicle 120 can process the multimedia content to determine the degree of traffic congestion, and determine whether to change its route.
  • the first vehicle 120 can output the multimedia content on a user interface, and the user can determine whether to change the route accordingly.
  • the second vehicle 122 can transmit a multimedia content response message to the first vehicle 120.
  • the multimedia content response message can include contact information of the second vehicle 122.
  • the first vehicle 120 and the second vehicle 122 can further communicate with each other by using each other's contact information and determine how to transmit the multimedia content.
  • the multimedia content request message, the multimedia content response message, or a combination thereof can include security credentials, e.g., public keys or certificates, that are used to establish a secure communication channel for the transmission of the multimedia content.
  • the multimedia content response message can comprise a V2V message.
  • the process 200 proceeds from 220 to 240, where the second vehicle 122 determines whether to forward the multimedia content request message.
  • the second vehicle 122 can determine whether to forward the multimedia content request message based on the relative locations of the first vehicle 120, the second vehicle 122, and the target location. For example, if the second vehicle 122 determines that, compared to the location of the first vehicle 120 as indicated by the current location field in the multimedia content request message, the current location of the second vehicle 122 is closer to the target location, the second vehicle 122 can determine to forward the multimedia content request message. Alternatively or in combination, the second vehicle 122 can determine whether the second vehicle 122 is on a route between the current location of the first vehicle 120 and the target location, and determine whether to forward the multimedia content request message accordingly.
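  • Continuing the earlier sketches (and reusing haversine_m() and the MultimediaContentRequest fields, both of which are assumptions, not part of the patent text), the forwarding heuristic described above might look like this:

```python
# Illustrative sketch of the step-240 forwarding decision: relay the request
# only when this vehicle is closer to the target than the requester is, so
# that each hop makes progress toward the target location.

def should_forward(request, my_lat: float, my_lon: float) -> bool:
    my_dist = haversine_m(my_lat, my_lon,
                          request.target_lat, request.target_lon)
    requester_dist = haversine_m(request.requester_lat, request.requester_lon,
                                 request.target_lat, request.target_lon)
    return my_dist < requester_dist
```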
  • the process 200 proceeds from 240 to 245, where the second vehicle 122 forwards the multimedia content request message to the third vehicle 124.
  • the third vehicle 124 can perform step 220, just as the second vehicle 122 did.
  • the third vehicle 124 can proceed to determine whether to generate and transmit multimedia content, or whether to continue to forward the multimedia content request message to a different vehicle.
  • the third vehicle 124 can transmit a multimedia content response message to the second vehicle 122, which would forward it to the first vehicle 120. While the illustrated examples include three vehicles, more than three vehicles can participate in the request of multimedia content.
  • the multimedia content request message can be forwarded from one vehicle to another, until it is received by a vehicle at a location that matches the target location.
  • the process 200 proceeds from 240 to 242, where the second vehicle 122 refrains from forwarding the multimedia content request message. In some cases, the second vehicle 122 can delete the multimedia content request message.
  • the second vehicle 122 can determine whether or not to generate and transmit multimedia content further based on user inputs. For example, if the second vehicle 122 determines that the current location matches the target location, the second vehicle 122 can output a user input request on the user interface of the second vehicle 122.
  • the user input request can include any one or more of a graphic, text, or audio prompt.
  • the second vehicle 122 can also output other information of the multimedia content request, e.g., the current location and the contact information of the first vehicle 120, the target location, or a combination thereof, on the user interface.
  • the second vehicle 122 can receive user input through its user interface, indicating whether the generating and transmitting of the multimedia content is authorized.
  • the second vehicle 122 can proceed to step 230 if the user input indicates that the generating and transmitting of the multimedia content is authorized. Similarly, the second vehicle 122 can determine whether to further forward the multimedia content request message based on user input. Alternatively or in combination, whether to generate and transmit the multimedia content, or whether to forward the multimedia content request message can be configured at the second vehicle 122, e.g., by an owner of the second vehicle 122, the manufacturer of the second vehicle 122, a server, or any combinations thereof.
  • the multimedia content request message can indicate an end time. If the end time has been reached, a vehicle that receives the multimedia content request message can discard it without generating multimedia content or forwarding the multimedia content request message.
  • the multimedia content request message and the multimedia content response message can comprise V2V messages.
  • these messages can also be carried using V2I communication protocols.
  • the first vehicle 120 can transmit the multimedia content request message to a V2I server, and the V2I server can forward the multimedia content request message in a V2I message to the second vehicle 122.
  • FIG. 3 is a flow diagram showing an example method 300 that provides inter-vehicle data communications for multimedia content, according to an implementation.
  • the method 300 can be implemented by the entities shown in FIG. 1 , including, for example, the second vehicle 122.
  • the method 300 shown in FIG. 3 can also be implemented using additional, fewer, or different entities.
  • the method 300 shown in FIG. 3 can be implemented using additional, fewer, or different operations, which can be performed in the order shown or in a different order. In some instances, an operation or a group of operations can be iterated or repeated, for example, for a specified number of iterations or until a terminating condition is reached.
  • a first vehicle receives a multimedia content request message from a second vehicle.
  • the multimedia content request message indicates a target location at which multimedia content is requested.
  • a current location of the first vehicle is determined.
  • the multimedia content is generated.
  • the first vehicle transmits the multimedia content.
  • the first vehicle receives a second multimedia content request message.
  • the second multimedia content request message indicates a second target location at which second multimedia content is requested.
  • a second current location of the first vehicle is determined.
  • the second multimedia content request message is forwarded to a fourth vehicle.
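  • As a recap of method 300, the following sketch ties the earlier illustrative pieces together (it reuses location_matches() and should_forward() from the sketches above). The generate_content, send_via_mms, and forward_v2v callbacks are hypothetical hooks standing in for the camera, the MMS transport, and the V2V stack; none of these names appear in the patent.

```python
# Illustrative end-to-end handling of a received multimedia content request
# (process 200 / method 300). Hook names are hypothetical assumptions.

import time

def handle_request(request, my_lat, my_lon,
                   generate_content, send_via_mms, forward_v2v) -> str:
    # Expired requests are discarded without generating or forwarding.
    if request.end_time_epoch is not None and time.time() > request.end_time_epoch:
        return "discarded"
    threshold = request.match_threshold_m or 200.0  # illustrative default
    if location_matches((my_lat, my_lon),
                        (request.target_lat, request.target_lon), threshold):
        # Steps 230/235: generate the content and return it to the requester
        # using the contact information carried in the request.
        content = generate_content(request.content_type, request.content_size)
        send_via_mms(request.contact_info, content)
        return "answered"
    if should_forward(request, my_lat, my_lon):
        # Step 245: relay the request one hop closer to the target location.
        forward_v2v(request)
        return "forwarded"
    return "dropped"  # step 242: refrain from forwarding
```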
  • Some of the subject matter and operations described in this disclosure can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures described in this disclosure and their structural equivalents, or in combinations of one or more of them.
  • Some of the subject matter described in this disclosure can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, data-processing apparatus.
  • the program instructions can be encoded on an artificially generated propagated signal, for example, a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus.
  • the computer-storage medium can be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or any combinations of computer-storage mediums.
  • the term "data-processing apparatus" encompasses all kinds of apparatus, devices, and machines for processing data, including, by way of example, a programmable processor, a computer, a system on a chip, or multiple ones or combinations of the foregoing.
  • the apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • the data processing apparatus or special purpose logic circuitry may be hardware- or software-based (or a combination of both hardware- and software-based).
  • the apparatus can optionally include code that creates an execution environment for computer programs, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of execution environments.
  • the present disclosure contemplates the use of data processing apparatuses with or without conventional operating systems, for example LINUX, UNIX, WINDOWS, MAC OS, ANDROID, IOS, or any other suitable, conventional operating system.
  • a computer program which may also be referred to, or described, as a program, software, a software application, a module, a software module, a script, or code, can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data, for example, one or more scripts stored in a markup language document, in a single file dedicated to the program in question, or in multiple coordinated files, for example, files that store one or more modules, sub-programs, or portions of code.
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site, or distributed across multiple sites and interconnected by a communication network. While portions of the programs illustrated in the various figures are shown as individual modules that implement the various features and functionality through various objects, methods, or other processes, the programs may instead include a number of sub-modules, third-party services, components, libraries, and such, as appropriate. Conversely, the features and functionality of various components can be combined into single components, as appropriate.
  • Some of the processes and logic flows described in this disclosure can be performed by one or more programmable processors, executing one or more computer programs to perform actions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory, or both.
  • a processor can include by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations of the foregoing.
  • a processor can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • Computers suitable for the execution of a computer program can be based on general or special purpose microprocessors, both, or any other kind of CPU.
  • a CPU will receive instructions and data from a read-only memory (ROM) or a random-access memory (RAM), or both.
  • the essential elements of a computer are a CPU, for performing or executing instructions, and one or more memory devices, for storing instructions and data.
  • a computer will also include, or be operatively coupled to, receive data from or transfer data to, or both, one or more mass storage devices for storing data, for example, magnetic, magneto-optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, for example, a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a global positioning system (GPS) receiver, or a portable storage device, for example, a universal serial bus (USB) flash drive, to name just a few.
  • Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including, by way of example, semiconductor memory devices, for example, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and flash memory devices; magnetic disks, for example, internal hard disks or removable disks; magneto-optical disks; and CD-ROM, DVD+/-R, DVD-RAM, and DVD-ROM disks.
  • the memory may store various objects or data, including caches, classes, frameworks, applications, backup data, jobs, web pages, web page templates, database tables, repositories storing dynamic information, and any other appropriate information including any parameters, variables, algorithms, instructions, rules, constraints, or references thereto. Additionally, the memory may include any other appropriate data, such as logs, policies, security or access data, reporting files, as well as others.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • the computer storage medium can be transitory, non-transitory, or a combination thereof.
  • implementations of the subject matter described in this disclosure can be implemented on a computer having a display device, for example, a CRT (cathode ray tube), LCD (liquid crystal display), LED (Light Emitting Diode), or plasma monitor, for displaying information to the user and a keyboard and a pointing device, for example, a mouse, trackball, or trackpad by which the user can provide input to the computer.
  • Input may also be provided to the computer using a touchscreen, such as a tablet computer surface with pressure sensitivity, a multi-touch screen using capacitive or electric sensing, or other type of touchscreen.
  • a computer can interact with a user by sending documents to, and receiving documents from a device that is used by the user, for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
  • the term "graphical user interface" (GUI) may be used in the singular or the plural to describe one or more graphical user interfaces and each of the displays of a particular graphical user interface. Therefore, a GUI may represent any graphical user interface, including but not limited to, a web browser, a touch screen, or a command line interface (CLI) that processes information and efficiently presents the information results to the user.
  • a GUI may include a plurality of user interface (UI) elements, some or all associated with a web browser, such as interactive fields, pull-down lists, and buttons operable by the business suite user. These and other UI elements may be related to or represent the functions of the web browser.
  • Implementations of the subject matter described in this disclosure can be implemented in a computing system that includes a back-end component, for example, as a data server, or that includes a middleware component, for example, an application server, or that includes a front-end component, for example, a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this disclosure, or any combination of one or more such back-end, middleware, or front-end components.
  • the components of the system can be interconnected by any form or medium of wireline or wireless digital data communication (or a combination of data communication), for example, a communication network.
  • Examples of communication networks include a local area network (LAN), a radio access network (RAN), a metropolitan area network (MAN), a wide area network (WAN), Worldwide Interoperability for Microwave Access (WIMAX), a wireless local area network (WLAN) using, for example, 802.11 a/b/g/n or 802.20 (or a combination of 802.11x and 802.20 or other protocols consistent with this disclosure), all or a portion of the Internet, or any other communication system, or systems at one or more locations (or a combination of communication networks).
  • the network may communicate using, for example, Internet Protocol (IP) packets, Frame Relay frames, Asynchronous Transfer Mode (ATM) cells, voice, video, data, or other suitable information (or a combination of communication types) between network addresses.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • any or all of the components of the computing system may interface with each other using an application programming interface (API), a service layer, or a combination of the API and service layer.
  • the API may include specifications for routines, data structures, and object classes.
  • the API may be either computer-language independent or dependent, and may refer to a complete interface, a single function, or even a set of APIs.
  • the service layer provides software services to the computing system. The functionality of the various components of the computing system may be accessible to all service consumers using this service layer.
  • Software services provide reusable, defined business functionalities through a defined interface.
  • the interface may be software written in JAVA, C++, or other suitable language providing data in extensible markup language (XML) format or other suitable format.
  • the API or service layer (or a combination of the API and the service layer) may be an integral or a stand-alone component in relation to other components of the computing system.
  • any or all parts of the service layer may be implemented as child or sub-modules of another software module, or hardware module without departing from the scope of this disclosure.
  • any claimed implementation below is considered to be applicable to at least a computer-implemented method; a non-transitory, computer-readable medium storing computer-readable instructions to perform the computer-implemented method; and a computer system comprising a computer memory interoperably coupled with a hardware processor configured to perform the computer-implemented method or the instructions stored on the computer-readable medium.


Claims (15)

  1. Un procédé (200), comprenant :
    la réception, au niveau d'un premier véhicule (122), d'un message de demande de contenu multimédia (210) en provenance d'un véhicule source (120), le message de demande de contenu multimédia indiquant un emplacement cible au niveau duquel un contenu multimédia est demandé ;
    la détermination (215) d'un emplacement actuel du premier véhicule (122) ;
    en réponse à la détermination (220) du fait que l'emplacement actuel correspond à l'emplacement cible, la génération (230) du contenu multimédia de l'emplacement cible et la transmission, par le premier véhicule (122), du contenu multimédia (235) au véhicule source ; caractérisé par :
    en réponse à la détermination (220) du fait que l'emplacement actuel ne correspond pas à l'emplacement cible, la détermination (240), par le premier véhicule, quant à savoir s'il faut réacheminer le message de demande de contenu multimédia vers un deuxième véhicule (124) ; et
    en réponse à la détermination du fait qu'il faut réacheminer le message de demande de contenu multimédia vers le deuxième véhicule (124), le réacheminement (245) du message de demande de contenu multimédia vers le deuxième véhicule (124).
  2. Le procédé de la revendication 1, dans lequel le contenu multimédia comprend au moins un élément parmi une photo ou une vidéo.
  3. The method of claim 1 or claim 2, wherein the multimedia content request message comprises contact information of the source vehicle requesting the multimedia content, and the multimedia content is transmitted to the source vehicle using the contact information of the source vehicle.
  4. The method of claim 3, wherein the multimedia content request message is transmitted using a vehicle-to-vehicle, V2V, message, and the multimedia content is transmitted using a multimedia messaging service, MMS, message.
  5. The method of any preceding claim, further comprising:
    receiving, at the first vehicle, a second multimedia content request message, the second multimedia content request message indicating a second target location at which second multimedia content is requested;
    determining a second current location of the first vehicle; and
    in response to determining that the second current location does not match the second target location, forwarding the second multimedia content request message to the second vehicle.
  6. The method of claim 5, wherein the second multimedia content request message indicates a third location of the source vehicle requesting the multimedia content, and the method further comprises:
    determining that the second target location is closer to the second current location of the first vehicle than to the third location of the source vehicle, and wherein the second multimedia content request message is forwarded in response to determining that the second target location is closer to the second current location of the first vehicle than to the third location of the source vehicle.
  7. The method of any preceding claim, further comprising: transmitting, by the first vehicle, a multimedia content response message, the multimedia content response message comprising contact information of the second vehicle.
  8. A first vehicle (122), comprising:
    at least one hardware processor (106); and
    at least one computer-readable storage medium (114) coupled to the at least one hardware processor and storing programming instructions for execution by the at least one hardware processor, the programming instructions,
    when executed, causing the at least one hardware processor to perform operations comprising:
    receiving, at the first vehicle, a multimedia content request message from a source vehicle (120), the multimedia content request message indicating a target location at which multimedia content is requested;
    determining (215) a current location of the first vehicle;
    in response to determining (220) that the current location matches the target location, generating (230) the multimedia content of the target location and transmitting, by the first vehicle, the multimedia content (235) to the source vehicle; characterized by:
    in response to determining (220) that the current location does not match the target location, determining (240), by the first vehicle, whether to forward the multimedia content request message to a second vehicle (124); and
    in response to determining to forward the multimedia content request message to the second vehicle (124), forwarding (245) the multimedia content request message to the second vehicle.
  9. The first vehicle of claim 8, wherein the multimedia content comprises at least one of a photo or a video.
  10. The first vehicle of claim 8 or claim 9, wherein the multimedia content request message comprises contact information of the source vehicle requesting the multimedia content, and the multimedia content is transmitted to the source vehicle using the contact information of the source vehicle.
  11. The first vehicle of claim 10, wherein the multimedia content request message is transmitted using a vehicle-to-vehicle, V2V, message, and the multimedia content is transmitted using a multimedia messaging service, MMS, message.
  12. The first vehicle of any of claims 8 to 11, the operations further comprising:
    receiving, at the first vehicle, a second multimedia content request message, the second multimedia content request message indicating a second target location at which second multimedia content is requested;
    determining a second current location of the first vehicle; and
    in response to determining that the second current location does not match the second target location, forwarding the second multimedia content request message to the second vehicle.
  13. The first vehicle of claim 12, wherein the second multimedia content request message indicates a third location of the source vehicle requesting the multimedia content, and the operations further comprise:
    determining that the second target location is closer to the second current location of the first vehicle than to the third location of the source vehicle, and wherein the second multimedia content request message is forwarded to the second vehicle in response to determining that the second target location is closer to the second current location of the first vehicle than to the third location of the source vehicle.
  14. The first vehicle of any of claims 8 to 13, the operations further comprising: transmitting, by the first vehicle, a multimedia content response message, the multimedia content response message comprising contact information of the second vehicle.
  15. A computer program which, when executed on a computing device, causes the computing device to perform the method of any of claims 1 to 7.
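Read together, the claims describe a small relay protocol: a request message names a target location and carries the requester's contact details, hops from vehicle to vehicle until it reaches a vehicle at the target, and the captured media travels back out-of-band over MMS. As an illustration only (the patent prescribes no wire format), here is a minimal Python sketch of the fields such a request message would need to carry; the class and field names are hypothetical:

from dataclasses import dataclass
from typing import Optional

@dataclass
class MultimediaContentRequest:
    # Target location at which multimedia content is requested (claims 1 and 5).
    target_lat: float
    target_lon: float
    # Contact information of the requesting source vehicle (claims 3 and 10);
    # for the MMS delivery of claims 4 and 11 this could be a phone number.
    source_contact: str
    # Location of the source vehicle, used by the forwarding test of
    # claims 6 and 13; optional because claims 1-5 do not require it.
    source_lat: Optional[float] = None
    source_lon: Optional[float] = None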
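A matching sketch of the receive-and-dispatch operations of claims 1 and 8, including the proximity-based forwarding test of claims 6 and 13. The great-circle distance and the 100 m "location matches" threshold are assumptions, since the claims leave the match criterion open, and capture_media, send_mms and forward_v2v are hypothetical stand-ins for the vehicle's camera, MMS stack and V2V stack:

import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    # Great-circle distance in kilometres between two latitude/longitude points.
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

MATCH_RADIUS_KM = 0.1  # assumed threshold for "current location matches target"

def handle_request(req, my_lat, my_lon, capture_media, send_mms, forward_v2v):
    # Current location matches the target: generate the content and send it to
    # the source vehicle using its contact information (claims 1, 3 and 4).
    if haversine_km(my_lat, my_lon, req.target_lat, req.target_lon) <= MATCH_RADIUS_KM:
        send_mms(req.source_contact, capture_media())
        return
    # No match: forward only if the target is closer to this vehicle than to
    # the source vehicle (claims 6 and 13), so each hop makes progress.
    if req.source_lat is not None and req.source_lon is not None:
        d_self = haversine_km(my_lat, my_lon, req.target_lat, req.target_lon)
        d_source = haversine_km(req.source_lat, req.source_lon, req.target_lat, req.target_lon)
        if d_self >= d_source:
            return  # relaying would move the request away from the target; drop it
    forward_v2v(req)  # relay the request to a second vehicle (claims 1 and 5)

Gating the relay on the distance comparison is what keeps a flooded request from bouncing back toward its origin: only vehicles nearer the target than the requester re-broadcast it.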
EP19167176.7A 2018-06-19 2019-04-03 Providing inter-vehicle data communications for multimedia content Active EP3585029B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/012,276 US10856120B2 (en) 2018-06-19 2018-06-19 Providing inter-vehicle data communications for multimedia content

Publications (2)

Publication Number Publication Date
EP3585029A1 (fr) 2019-12-25
EP3585029B1 (fr) 2022-03-02

Family

ID=66323651

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19167176.7A Active EP3585029B1 (fr) 2018-06-19 2019-04-03 Providing inter-vehicle data communications for multimedia content

Country Status (2)

Country Link
US (1) US10856120B2 (fr)
EP (1) EP3585029B1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10892858B2 (en) 2018-09-28 2021-01-12 At&T Intellectual Property I, L.P. Chain broadcasting in vehicle-to-everything (V2X) communications
US10916136B2 (en) * 2018-12-04 2021-02-09 Alpine Electronics, Inc. Geo-tagged vehicle-to-vehicle communication system
US11570625B2 (en) * 2019-03-25 2023-01-31 Micron Technology, Inc. Secure vehicle communications architecture for improved blind spot and driving distance detection
US11064030B2 (en) * 2019-10-17 2021-07-13 Cisco Technology, Inc. Automatic on-boarding agent for IOT edge routers in connected vehicles
US20210261247A1 (en) * 2020-02-26 2021-08-26 Nxp B.V. Systems and methodology for voice and/or gesture communication with device having v2x capability
US10972958B1 (en) * 2020-03-05 2021-04-06 At&T Intellectual Property I, L.P. Location-based route management for vehicle-to-everything relay communications
JP7298526B2 (ja) * 2020-03-24 2023-06-27 Toyota Motor Corporation Information processing device, program, and method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100880115B1 (ko) 2008-04-21 2009-01-23 주식회사 에지텍 Wireless transmission/reception apparatus for vehicle rear- and side-view images, independent of the vehicle, and method therefor
US9140782B2 (en) 2012-07-23 2015-09-22 Google Technology Holdings LLC Inter-vehicle alert system with nagable video look ahead
US20170017734A1 (en) 2015-07-15 2017-01-19 Ford Global Technologies, Llc Crowdsourced Event Reporting and Reconstruction
US10952054B2 (en) 2015-10-09 2021-03-16 Ford Global Technologies, Llc Vehicle based content sharing
KR102464898B1 (ko) * 2016-01-05 2022-11-09 Samsung Electronics Co., Ltd. Method and device for sharing image information related to a vehicle

Also Published As

Publication number Publication date
EP3585029A1 (fr) 2019-12-25
US10856120B2 (en) 2020-12-01
US20190387378A1 (en) 2019-12-19

Similar Documents

Publication Publication Date Title
EP3585029B1 (fr) Fourniture de communications de données entre véhicules pour contenu multimédia
US11375352B2 (en) Devices and methods for updating maps in autonomous driving systems in bandwidth constrained networks
KR102243244B1 (ko) 자율주행시스템에서 긴급단계에 따른 제어방법 및 이를 위한 장치
US11120693B2 (en) Providing inter-vehicle data communications for vehicular drafting operations
KR102195939B1 (ko) 자율주행 차량의 배터리 충전 방법 및 이를 위한 장치
CN107305740B (zh) 路况预警方法、设备、服务器、控制设备及操作系统
US10171953B2 (en) Vehicle event notification via cell broadcast
KR102164188B1 (ko) 자율 주행 시스템에서 차량을 제어하기 위한 방법 및 장치
KR20190106847A (ko) 자율주행시스템에서 차량의 오류 판단방법 및 이를 위한 장치
KR101763604B1 (ko) 무선 통신 시스템에서 위치 기반으로 주변 차량의 멀티미디어 데이터를 수신하는 방법 및 그 장치
US20210132604A1 (en) Autonomous passenger vehicle system
KR20190116192A (ko) 자율주행 차량 해킹 대응 방법 및 그 장치
KR20190103089A (ko) 자율주행시스템에서 응급차량을 위한 주차차량을 이동시키는 방법 및 이를 위한 장치
KR20190107277A (ko) 자율 주행 시스템에서 차량을 제어하는 방법 및 장치
EP3531331A1 (fr) Fourniture de communications de données sécurisées entre véhicules
KR20210106688A (ko) 지능적인 빔 추적 방법 및 이를 위한 자율 주행 차량
KR102203475B1 (ko) 자율 주행 시스템에서 차량을 제어하기 위한 방법 및 장치
KR20190098092A (ko) 자율주행 시스템에서 해킹 차량 관리 방법 및 그 장치
EP3756372B1 (fr) Fourniture de données de capteur sécurisées à des machines automatisées
US11568741B2 (en) Communication device, control method thereof, and communication system including the same
KR20190102145A (ko) 자율주행시스템에서 원격주행을 위한 센싱정보 전송방법 및 이를 위한 장치
EP4156729A1 (fr) Calcul de carte de grille d'occupation, détection complémentaire v2x et coordination de transmission de données de perception coopérative dans des réseaux sans fil
US11180115B2 (en) Controlling vehicle operations based on vehicle information
US20210331678A1 (en) Method of providing vehicle refuge information in disaster situation and apparatus therefor
KR20210082321A (ko) 인공지능형 모빌리티 디바이스 제어 방법 및 인공지능형 모빌리티를 제어하는 지능형 컴퓨팅 디바이스

Legal Events

Code  Title and description (effective dates inline)
PUAI  Public reference made under Article 153(3) EPC to a published international application that has entered the European phase (ORIGINAL CODE: 0009012)
STAA  Status: the application has been published
AK    Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX    Request for extension of the European patent; extension states: BA ME
STAA  Status: request for examination was made
17P   Request for examination filed; effective date: 20200625
RBV   Designated contracting states (corrected): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
GRAP  Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
STAA  Status: grant of patent is intended
INTG  Intention to grant announced; effective date: 20210401
GRAJ  Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the EPO deleted (ORIGINAL CODE: EPIDOSDIGR1)
STAA  Status: request for examination was made
INTC  Intention to grant announced (deleted)
GRAP  Despatch of communication of intention to grant a patent (ORIGINAL CODE: EPIDOSNIGR1)
STAA  Status: grant of patent is intended
INTG  Intention to grant announced; effective date: 20210921
GRAS  Grant fee paid (ORIGINAL CODE: EPIDOSNIGR3)
GRAA  (Expected) grant (ORIGINAL CODE: 0009210)
STAA  Status: the patent has been granted
AK    Designated contracting states (kind code of ref document: B1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
REG   Reference to a national code: GB, legal event code FG4D
REG   Reference to a national code: CH, legal event code EP
REG   Reference to a national code: AT, legal event code REF; ref document 1473196, kind code T; effective date: 20220315
REG   Reference to a national code: DE, legal event code R096; ref document 602019012000
REG   Reference to a national code: IE, legal event code FG4D
REG   Reference to a national code: LT, legal event code MG9D
REG   Reference to a national code: NL, legal event code MP; effective date: 20220302
PG25  Lapsed in contracting states [announced via postgrant information from national office to EPO] because of failure to submit a translation of the description or to pay the fee within the prescribed time limit ("translation/fee" below): SE, RS, LT, HR, ES (effective 20220302); NO, BG (effective 20220602)
REG   Reference to a national code: AT, legal event code MK05; ref document 1473196, kind code T; effective date: 20220302
PG25  Lapsed (translation/fee): PL, LV, FI (effective 20220302); GR (effective 20220603)
PG25  Lapsed (translation/fee): NL (effective 20220302)
PG25  Lapsed (translation/fee): SM, SK, RO, EE, CZ, AT (effective 20220302); PT (effective 20220704)
PG25  Lapsed (translation/fee): IS (effective 20220702); AL (effective 20220302)
REG   Reference to a national code: CH, legal event code PL
REG   Reference to a national code: DE, legal event code R097; ref document 602019012000
REG   Reference to a national code: BE, legal event code MM; effective date: 20220430
PLBE  No opposition filed within time limit (ORIGINAL CODE: 0009261)
STAA  Status: no opposition filed within time limit
PG25  Lapsed: MC, DK (translation/fee, effective 20220302); LU (non-payment of due fees, effective 20220403); LI, CH (non-payment of due fees, effective 20220430)
26N   No opposition filed; effective date: 20221205
PG25  Lapsed: SI (translation/fee, effective 20220302); BE (non-payment of due fees, effective 20220430)
PG25  Lapsed: IE (non-payment of due fees, effective 20220403)
P01   Opt-out of the competence of the Unified Patent Court (UPC) registered; effective date: 20230518
PG25  Lapsed: IT (translation/fee, effective 20220302)
PGFP  Annual fee paid to national office [announced via postgrant information from national office to EPO]: FR (payment date 20230425, year of fee payment: 5); DE (payment date 20230427, year of fee payment: 5)
PGFP  Annual fee paid to national office: GB (payment date 20230427, year of fee payment: 5)
PG25  Lapsed: MK, CY (translation/fee, effective 20220302)