US20230242099A1 - Method for Vehicle Driving Assistance within Delimited Area - Google Patents

Method for Vehicle Driving Assistance within Delimited Area

Info

Publication number
US20230242099A1
Authority
US
United States
Prior art keywords
vehicle
vehicles
computer
sensing data
delimited area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/160,553
Inventor
Ahmad Pishehvari
Dennis Vollbracht
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aptiv Technologies AG
Original Assignee
Aptiv Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aptiv Technologies Ltd filed Critical Aptiv Technologies Ltd
Assigned to APTIV TECHNOLOGIES LIMITED reassignment APTIV TECHNOLOGIES LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: VOLLBRACHT, DENNIS
Publication of US20230242099A1
Assigned to APTIV TECHNOLOGIES (2) S.À R.L. reassignment APTIV TECHNOLOGIES (2) S.À R.L. ENTITY CONVERSION Assignors: APTIV TECHNOLOGIES LIMITED
Assigned to APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L. reassignment APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L. MERGER Assignors: APTIV TECHNOLOGIES (2) S.À R.L.
Assigned to Aptiv Technologies AG reassignment Aptiv Technologies AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: APTIV MANUFACTURING MANAGEMENT SERVICES S.À R.L.

Classifications

    • B60W 30/06: Automatic manoeuvring for parking
    • G06F 18/251: Fusion techniques of input or preprocessed data
    • G06V 20/56: Context or environment of an image exterior to a vehicle, using sensors mounted on the vehicle
    • G07C 5/008: Registering or indicating the working of vehicles, communicating information to a remotely located station
    • G08G 1/0112: Measuring and analyzing traffic conditions based on data from the vehicle, e.g. floating car data [FCD]
    • G08G 1/0133: Traffic data processing for classifying the traffic situation
    • G08G 1/0141: Measuring and analyzing traffic conditions for traffic information dissemination
    • G08G 1/096725: Transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G 1/096741: Transmission of highway information where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G 1/096775: Transmission of highway information where the origin of the information is a central station
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. sensor networks or networks in vehicles
    • H04W 4/021: Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H04W 4/33: Services specially adapted for indoor environments, e.g. buildings
    • H04W 4/38: Services specially adapted for collecting sensor information
    • H04W 4/40: Services specially adapted for vehicles, e.g. vehicle-to-pedestrian [V2P]
    • H04W 4/46: Vehicle-to-vehicle communication [V2V]

Definitions

  • An autonomous or automated vehicle must be able to perform different tasks, including self-localization, environment modelling, mapping, and tracking of external objects such as pedestrians, other vehicles, and bikes, whatever the circumstances.
  • A known method of self-localization uses a satellite navigation system such as GNSS (Global Navigation Satellite System). It allows a vehicle provided with a receiver for the GNSS signals to determine its location (longitude, latitude, and altitude/elevation) to high precision (within a few centimeters to meters) using time signals transmitted along a line of sight by radio from satellites.
  • Within an indoor environment, however, the signals from the satellite navigation system are generally not available, and self-localization must be performed in a different way.
  • In one known method, successive observations are collected by one or more onboard sensors, such as a LiDAR or a camera, and registered, and the motion of the vehicle is derived from these successive observations.
  • A disadvantage of the first and second known methods is that they perform relative positioning, which requires knowledge of an initial position and orientation of the vehicle.
  • The method of KR20190121275 A is limited to the positioning of one vehicle in an indoor space based on a pre-generated indoor map.
  • The known methods are limited to the self-localization of a single vehicle in an indoor environment. There is therefore a need to improve automated driving and/or driving assistance within an indoor environment that other participants, such as other vehicles and/or pedestrians, can enter.
  • The step of centrally processing and fusing may include generating a fused map of the delimited area based on the sensing data received from the plurality of vehicles, and the step of transmitting information for vehicle driving assistance may include transmitting the generated map.
  • The fused map may aggregate the environmental information perceived by the sensing systems of all the vehicles and may be shared with all vehicles in real time. In this way, each vehicle can map the environment within the delimited area as perceived by the sensors of all vehicles.
  • FIG. 1 illustrates a distributed system for vehicle driving assistance in a delimited area, according to various embodiments.
  • FIGS. 2A and 2B illustrate a flowchart of a computer-implemented method for vehicle driving assistance of a target vehicle within a delimited area, according to various embodiments.
  • The present disclosure relates to methods and systems for vehicle driving assistance within a delimited area 100, including those described in the claims. Embodiments are given in the claims, the description, and the drawings.
  • The delimited area 100 may include an indoor space such as a road tunnel as illustrated in FIG. 1, a car park, or a warehouse where automated vehicles drive to perform logistics tasks.
  • The delimited area may include any other zone requiring improved vehicle driving assistance, in particular for security reasons.
  • For example, the delimited area may include a road bridge.
  • A plurality of vehicles may enter and move within the delimited area 100.
  • Other participants, such as pedestrians, may also enter and move within the delimited area 100.
  • The delimited area 100 is equipped with a central data processing system 200 and is covered by a wireless local area network 300 to which the vehicles Vi can connect.
  • The delimited area 100 may be termed a sensing area, as it is an area in which vehicles send, or upload, sensing data captured by vehicle onboard sensors to the central data processing system 200 through the wireless local area network 300, as described later in more detail.
  • The coverage area 301 of the base station(s) may extend beyond the delimited area 100, allowing vehicles that are outside the delimited area 100 and moving towards an access to it to connect to the wireless local area network 300 before entering the delimited area 100.
  • The wireless local area network 300 could be part of a mobile telecommunication network, for example one or more cells of that network.
  • The vehicles Vi may be automated vehicles or vehicles comprising one or more ADAS (Advanced Driving Assistance Systems).
  • Each vehicle Vi has one or more onboard sensors or sensing systems, such as radar(s), LiDAR(s), and/or camera(s), for collecting sensing data related to its own environment.
  • Each vehicle Vi comprises a radio communication module for connecting to the wireless local area network 300 and communicating through it.
  • The radio communication module may include a radio transmitter and a radio receiver. It can comprise hardware means and/or software means.
  • The radio communication module allows the vehicle Vi to connect to the wireless local area network 300, transmit the sensing data collected by the sensors of the vehicle Vi to the central data processing system 200, and receive from the central data processing system 200 information for vehicle driving assistance.
  • Each vehicle may also include a registration module for registering with the central data processing system 200 upon detection of and/or connection to the wireless local area network 300.
  • FIG. 2A illustrates the computer-implemented method for vehicle driving assistance within the delimited area 100, according to an embodiment.
  • A registration step S0 may be executed by each vehicle Vi entering the delimited area 100, for example upon detection of and connection to the wireless local area network 300.
  • Vehicles Vi may be deleted from the database, for example when they are no longer connected to the wireless local area network 300.
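As an illustration of this registration lifecycle (registration in step S0, deletion on disconnection), the following Python sketch keeps registered vehicles in a simple in-memory database. The class and method names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the vehicle registration lifecycle: vehicles are
# added to a database when they connect to the wireless LAN (step S0) and
# removed again once they are no longer connected.

class VehicleRegistry:
    """Tracks which vehicles are currently registered with the central system."""

    def __init__(self):
        self._vehicles = {}  # vehicle_id -> metadata

    def register(self, vehicle_id, metadata=None):
        """Called when a vehicle detects and connects to the network (step S0)."""
        self._vehicles[vehicle_id] = metadata or {}

    def deregister(self, vehicle_id):
        """Called when a vehicle is no longer connected to the network."""
        self._vehicles.pop(vehicle_id, None)

    def registered_ids(self):
        return set(self._vehicles)


registry = VehicleRegistry()
registry.register("V1", {"type": "passenger"})
registry.register("V2")
registry.deregister("V1")
print(sorted(registry.registered_ids()))  # ['V2']
```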
  • The sensing data collected, or captured, by the sensors in each vehicle Vi is continuously transmitted from the vehicle Vi to the central data processing system 200 in real time, through the wireless local area network 300, in a step S1.
  • The transmission of the data collected by the onboard sensors of each vehicle Vi may be performed automatically, without any action from a user.
  • The sensing data may be transferred to the central data processing system in real time, as soon as it is collected.
  • The sensing data may be raw, unprocessed data from the sensor(s).
  • The central data processing system 200 receives, from the plurality of vehicles Vi located within the delimited area 100 and connected to the wireless local area network 300, the sensing data related to the environment around each vehicle Vi.
  • Each vehicle Vi perceives its surrounding environment with its own onboard sensors.
  • The output of the sensors, termed the sensing data, may be transferred to the central data processing system 200, through the wireless local area network 300, by an onboard radio transmitter of the vehicle Vi.
  • The sensing data may include point clouds for radars and LiDARs, and picture frames for cameras.
  • The sensing systems of the vehicles Vi, which may include cameras, radars, and LiDARs, do not need additional means for processing the sensing data. They can be basic, low-cost sensing systems having only the functions of sensing and transferring data.
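The sensing-data upload of step S1 could carry messages like the following hypothetical structure, covering the point clouds (radar/LiDAR) and picture frames (camera) named above. The container, field names, and types are assumptions for illustration only, not part of the patent.

```python
# A hypothetical message format for the raw sensing data a vehicle uploads
# to the central data processing system in step S1.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SensingMessage:
    vehicle_id: str
    timestamp: float                  # capture time in seconds (illustrative)
    sensor_type: str                  # "radar", "lidar", or "camera"
    point_cloud: List[Tuple[float, float, float]] = field(default_factory=list)
    frame: bytes = b""                # raw image bytes for camera frames


msg = SensingMessage("V3", 12.5, "lidar",
                     point_cloud=[(1.0, 0.5, 0.0), (2.0, -0.3, 0.1)])
print(msg.sensor_type, len(msg.point_cloud))  # lidar 2
```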
  • The central data processing system 200 centrally processes and fuses, or aggregates, the sensing data received from the plurality of vehicles Vi to determine fused, or aggregated, environmental information for the environment within the delimited, or sensing, area 100.
  • The sensor information from all the vehicles Vi located within the delimited area 100 is fused, or aggregated, by the central processing system 200. It is as if the central processing system 200 were equipped with the onboard sensors of all the vehicles Vi located in the delimited area 100.
  • The fused or aggregated environmental information may be shared with all the vehicles Vi within the delimited area 100 through the wireless network 300. In this way, each vehicle Vi is virtually equipped with the sensors of all the vehicles Vi located in the delimited area 100.
  • The central data processing system 200 may have access to a predefined map of the delimited area 100.
  • This predefined map may be stored in a memory of the central processing system 200, and/or obtained from an online database of a service provider such as OpenStreetMap® or Google Maps®.
  • The central data processing system 200 may process and analyze the sensor data received from all the vehicles Vi to obtain additional information related to the environment within the delimited area 100, and update the predefined map using this additional information.
  • The update of the predefined map may allow objects or participants to be added to or deleted from the map, and/or information useful for vehicle driving assistance in the delimited area 100 to be added. For example, one or more pedestrians, an object that has fallen to the ground, a traffic jam, or a hazardous event such as a stopped vehicle or a collision may be added to the predefined map.
  • The central processing system 200 may generate an individual map for each vehicle Vi, representing an area around the vehicle Vi, using the sensor data from said vehicle Vi, and then fuse, or aggregate, the plurality of individual maps generated for the plurality of vehicles Vi to generate a fused map of the delimited area 100.
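The two-stage fusion described above (an individual map per vehicle, then aggregation into one fused map) can be sketched, under strong simplifying assumptions, as binary occupancy grids combined by a logical OR. The grid size, resolution, and OR-fusion rule are illustrative choices, not from the patent.

```python
# Minimal sketch of two-stage map fusion: rasterize each vehicle's
# detections into an individual occupancy grid, then fuse the grids so a
# cell is marked occupied if any vehicle perceived something there.

def individual_map(points, width=10, height=10):
    """Rasterize one vehicle's (x, y) detections into a binary grid."""
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y)][int(x)] = 1
    return grid


def fuse_maps(maps, width=10, height=10):
    """Fuse per-vehicle grids with a logical OR per cell."""
    fused = [[0] * width for _ in range(height)]
    for grid in maps:
        for r in range(height):
            for c in range(width):
                fused[r][c] |= grid[r][c]
    return fused


m1 = individual_map([(1, 1)])        # detection seen only by vehicle V1
m2 = individual_map([(8, 2)])        # detection seen only by vehicle V2
fused = fuse_maps([m1, m2])
print(fused[1][1], fused[2][8], fused[0][0])  # 1 1 0
```

With this rule, each vehicle receiving the fused grid "sees" detections from every other vehicle's sensors, matching the idea of being virtually equipped with all onboard sensors.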
  • The central data processing system 200 may determine, or identify, a hazardous situation in the delimited area 100, in a step S31.
  • A hazardous situation may be, for example, that an accident has occurred in the delimited area 100, that a vehicle is stopped on a road lane in a tunnel, or that an object has fallen to the ground within the delimited area 100.
  • The hazardous situation may be identified by processing and analyzing the sensing data received from the vehicles Vi.
  • Stationary traffic can be detected by analyzing the speed data from the vehicles Vi.
  • An object on the ground or pedestrians can be detected by image analysis of images captured by sensors of the vehicles Vi.
  • A collision may be detected by analyzing images captured by vehicles, etc.
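For example, the detection of stationary traffic from vehicle speed data (one hazard type of step S31) might be approximated as follows; the threshold values are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of one hazard check: flag stationary traffic when a large
# enough share of the reporting vehicles is (near-)stationary.

def detect_stationary_traffic(speeds_mps, speed_threshold=1.0, min_fraction=0.5):
    """Return True if at least `min_fraction` of vehicles move slower
    than `speed_threshold` metres per second (thresholds are illustrative)."""
    if not speeds_mps:
        return False
    stopped = sum(1 for v in speeds_mps if v < speed_threshold)
    return stopped / len(speeds_mps) >= min_fraction


print(detect_stationary_traffic([0.0, 0.2, 0.5, 14.0]))   # True (3 of 4 stopped)
print(detect_stationary_traffic([12.0, 13.5, 11.0, 0.0])) # False (1 of 4 stopped)
```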
  • The central data processing system 200 generates a warning message, or warning information, to inform the vehicles Vi in the delimited area 100, in a step S32.
  • The central data processing system 200 may generate commands to control a driving or parking function of one or more target vehicles Vi in the delimited area 100, based on a result of centrally processing and fusing the sensor data of all the vehicles Vi.
  • The commands may control a target vehicle to adapt its speed in a tunnel, or take over the driving of a target vehicle to a chosen parking lot in a car park.
  • The commands may control the movements of the automated vehicles in the warehouse.
  • The target vehicles may include one or more vehicles located within the delimited area 100, and/or one or more vehicles located outside the delimited area 100 but within the coverage area 301 of the wireless local area network 300.
  • One or more target vehicles could be vehicles located outside the delimited area 100 and moving towards an access to the delimited area 100 in order to enter it.
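A hypothetical selection of target vehicles, covering both vehicles inside the delimited area and vehicles outside it but within the coverage area and moving towards an access, could look like this. The circular-area geometry and all names are assumptions for illustration, not part of the patent.

```python
# Illustrative target-vehicle selection: a vehicle is a target if it is
# inside the delimited area, or outside it but within network coverage
# and heading towards the area.

import math


def select_targets(vehicles, area_center, area_radius, coverage_radius):
    """vehicles: dict of vehicle_id -> (x, y, heading_to_area: bool)."""
    targets = []
    for vid, (x, y, heading_to_area) in vehicles.items():
        dist = math.hypot(x - area_center[0], y - area_center[1])
        if dist <= area_radius:
            targets.append(vid)            # inside the delimited area
        elif dist <= coverage_radius and heading_to_area:
            targets.append(vid)            # approaching, within coverage
    return sorted(targets)


vehicles = {
    "V1": (0.0, 0.0, False),    # inside the delimited area
    "V2": (150.0, 0.0, True),   # outside but approaching, within coverage
    "V3": (150.0, 0.0, False),  # outside, not approaching
}
print(select_targets(vehicles, (0.0, 0.0), 100.0, 200.0))  # ['V1', 'V2']
```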
  • The central data processing system 200 may transmit current information for vehicle driving assistance, for example the current fused map, and/or a current warning message, and/or commands for driving or parking, to the newly registered vehicle.
  • The information for vehicle driving assistance is continuously updated, in real time, based on the sensing data received over time by the central data processing system 200.
  • The fused environmental information resulting from the step of centrally processing and fusing the sensing data from all the vehicles Vi may also be used by the central data processing system 200 to perform the tasks of driving assistance or autonomous driving for the considered vehicle Vi.
  • The result of each task of driving assistance or autonomous driving is transferred from the central data processing system 200 to the vehicle Vi through the wireless network 300, in a step S6.
  • The central data processing system 200 may include a network interface 210 for connecting to the wireless local area network 300, a reception module 220, a data processing module 230, and a transmission module 240.
  • The transmitted information may include the fused environmental information, and/or information generated based on the fused environmental information, for example one or more commands for controlling the target vehicle Vt to drive autonomously, or a warning message.
  • The central data processing system 200 may further include a vehicle database 250 for storing information on the vehicles located within the delimited area that have registered with the central data processing system 200, a registration module 260 configured to perform the task of registering vehicles located within the delimited area 100, and a database management module 270 responsible for storage, retrieval, and update of information in the database 250.
  • The registration module 260 is responsible for registering vehicles in the database 250.
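The cooperation of the modules listed above (reception 220, processing 230, transmission 240, vehicle database 250, registration 260) can be sketched schematically as follows; the method names and the toy fusion rule are illustrative assumptions, not taken from the patent.

```python
# Schematic skeleton of the central data processing system's modules.
# Comments map each method to the reference numeral it illustrates.

class CentralDataProcessingSystem:
    def __init__(self):
        self.vehicle_db = {}   # 250: vehicle database
        self.received = []     # buffer filled by the reception module (220)

    def register_vehicle(self, vehicle_id):        # 260: registration module
        self.vehicle_db[vehicle_id] = {"registered": True}

    def receive(self, vehicle_id, sensing_data):   # 220: reception module
        self.received.append((vehicle_id, sensing_data))

    def process_and_fuse(self):                    # 230: data processing module
        # Toy fusion rule: the union of all detections reported by all vehicles.
        fused = set()
        for _, data in self.received:
            fused.update(data)
        return fused

    def transmit(self, fused):                     # 240: transmission module
        # Share the fused environmental information with every registered vehicle.
        return {vid: fused for vid in self.vehicle_db}


system = CentralDataProcessingSystem()
system.register_vehicle("V1")
system.register_vehicle("V2")
system.receive("V1", {"pedestrian"})
system.receive("V2", {"stopped_vehicle"})
out = system.transmit(system.process_and_fuse())
print(sorted(out["V1"]))  # ['pedestrian', 'stopped_vehicle']
```

Note how each registered vehicle receives the union of all detections, so V1 learns about the stopped vehicle that only V2 observed.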
  • A phrase referring to “at least one of” a list of items refers to any combination of those items, including single members.
  • “at least one of a, b, or c” can cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, c-c-c, or any other ordering of a, b, and c).


Abstract

The present disclosure relates to the field of vehicle driving assistance and/or autonomous driving within a delimited area, such as an indoor space. The disclosure includes a computer-implemented method for vehicle driving assistance within a delimited area, which includes the following steps performed by a central data processing system: receiving from a plurality of vehicles (Vi, i=1,2,3,...) located within the delimited area, through a wireless local area network covering said delimited area, sensing data from onboard sensors of the vehicles (Vi), whereby the sensing data from each vehicle (Vi) is related to an environment around said vehicle (Vi); centrally processing and fusing the sensing data from the plurality of vehicles (Vi); and transmitting to at least one target vehicle (Vi), through the wireless local area network, information for vehicle driving assistance based on a result of the step of centrally processing and fusing.

Description

    INCORPORATION BY REFERENCE
  • This application claims priority to European Patent Application Number EP22153966.1, filed Jan. 28, 2022, the disclosure of which is incorporated by reference in its entirety.
  • BACKGROUND
  • An autonomous or automated vehicle must be able to perform different tasks including self-localization, environment modelling, mapping and tracking of external objects such as pedestrians, other vehicles, and bikes, whatever the circumstances.
  • Self-localization may be the most important task of an autonomous vehicle. The more accurate the self-localization of the autonomous vehicle, the more accurate the localization and tracking of external objects. Self-localization of the autonomous vehicle can be performed based on data collected by different onboard sensing systems such as cameras, LiDARs, radars, etc.
  • Furthermore, the autonomous vehicle has to generate an environment model to recognize the static objects, for example a wall, in its surrounding environment, avoid collisions with the static objects, and plan an optimal path from a point A to a point B.
  • A known method of self-localization uses a satellite navigation system such as GNSS (Global Navigation Satellite System). It allows the vehicle provided with a receiver for receiving the GNSS signals to determine its location (longitude, latitude, and altitude/elevation) to high precision (within a few centimeters to meters) using time signals transmitted along a line of sight by radio from satellites. However, within an indoor environment, e.g., in a tunnel or a parking garage, the signals from the satellite navigation system are generally not available and the self-localization must be performed in a different way.
  • Different methods are known for self-localization of an autonomous vehicle in an indoor environment.
  • A first known method consists in tracking the position and orientation of the vehicle using mechanical motion sensors, such as accelerometers and gyroscopes, and a motion model.
  • In a second known method, successive observations are collected by one or more onboard sensors, such as a LiDAR or a camera, registered, and the motion of the vehicle is derived from these successive observations.
  • The document KR20190121275 A discloses an indoor positioning method for positioning a vehicle within an indoor space, based on a machine learning and/or artificial intelligence algorithm. The vehicle acquires image information of the indoor space with a sensing system, and the acquired information is matched with a pre-generated indoor map to localize the vehicle in the indoor space based on the matching result.
  • A disadvantage of the first and second methods is that they perform relative positioning, which requires knowledge of an initial position and orientation of the vehicle. The method of KR20190121275 A is limited to the positioning of one vehicle in the indoor space based on a pre-generated indoor map.
  • The known methods are limited to the self-localization of a single vehicle in an indoor environment. Therefore, there is a need to improve automated driving and/or driving assistance within an indoor environment that other participants, such as other vehicles and/or pedestrians, can enter.
  • SUMMARY
  • The present disclosure relates to the field of vehicle driving assistance and/or autonomous driving within a delimited area, such as an indoor space.
  • The present disclosure includes a computer-implemented method for vehicle driving assistance within a delimited area, including the following steps performed by a central data processing system:
    • receiving from a plurality of vehicles located within the delimited area, through a wireless local area network, sensing data from onboard sensors of the vehicles, whereby the sensing data from each vehicle is related to an environment around said vehicle;
    • centrally processing and fusing the sensing data from the plurality of vehicles;
    • transmitting to at least one target vehicle of the plurality of vehicles, through the wireless local area network, information for vehicle driving assistance based on a result of the step of centrally processing and fusing.
  • The present method originates from a need to improve vehicle autonomous driving and/or vehicle driving assistance within an indoor environment. However, the present method applies more generally to a delimited area that may include a tunnel, a car park, a warehouse, a bridge, or any other type of zone requiring improved vehicle driving assistance, for example for security reasons. The present method makes it possible to obtain a centralized perception of the delimited area by sharing the sensing systems, or sensors, of all vehicles located in the delimited area. It is as if the central data processing system were equipped with the sensors of all vehicles. Furthermore, the processing of the sensing data collected by the sensors is carried out by the central data processing system, which allows basic, low-cost sensors without processing means to be used in the vehicles. As a result, the implementation is simple and low-cost.
  • In an embodiment, the step of centrally processing and fusing may include generating a fused map of the delimited area based on the sensing data received from the plurality of vehicles, and the step of transmitting information for vehicle driving assistance includes transmitting the generated map.
  • The fused map may aggregate the environmental information perceived by the sensing systems of all the vehicles and may be shared with all vehicles, in real time. In this way, each vehicle can map the environment within the delimited area as perceived by the sensors of all vehicles.
  • Other features of the present disclosure are defined in the appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Other features, purposes, and aspects of the disclosure will become more explicit by means of reading the detailed statement of the non-restrictive example embodiments made with reference to the accompanying drawings.
  • FIG. 1 illustrates a distributed system for vehicle driving assistance in a delimited area, according to various embodiments.
  • FIGS. 2A and 2B illustrate a flowchart of a computer-implemented method for vehicle driving assistance of a target vehicle within a delimited area, according to various embodiments.
  • FIG. 3 illustrates a schematic block diagram of a central data processing system, according to various embodiments.
  • DETAILED DESCRIPTION
  • The present disclosure relates to methods and systems for vehicle driving assistance within a delimited area 100, including those described in the claims. Embodiments are given in the claims, the description, and the drawings.
  • For example, the delimited area 100 may include an indoor space such as a road tunnel as illustrated in FIG. 1 , a car park, or a warehouse where automated vehicles drive to perform logistics tasks. The delimited area may include any other zone requiring improved vehicle driving assistance, in particular for security reasons. For example, the delimited area may include a road bridge. These examples are only illustrative and non-limitative.
  • A plurality of vehicles, referenced as Vi with i = 1, 2, 3, ..., may enter and move within the delimited area 100. Other participants, such as pedestrians, may also enter and move within the delimited area 100.
  • The delimited area 100 is equipped with a central data processing system 200 and is covered by a wireless local area network 300, to which vehicles Vi can connect.
  • The delimited area 100 may be termed as a sensing area, as it is an area in which vehicles send, or upload, sensing data captured by vehicle onboard sensors to the central data processing system 200 through the wireless local area network 300, as will be described later in more detail.
  • In an embodiment, the wireless local area network 300 may be a 5G-based network using the 5G technology. The 5G technology offers an extremely low latency, that is, a short delay between sending and receiving information. Alternatively, the wireless local area network 300 could be based on any other wireless communication technology offering a low latency, for example 6G or any future generation communication system (e.g., a next generation communication system). The local area network 300 has a network identifier, or network ID, that may be broadcast by one or more base stations within an area covering the delimited area 100. The covering area 301 of the base station(s) may extend beyond the delimited area 100 to allow vehicles that are outside the delimited area 100 and moving towards an access to the delimited area 100 to connect to the wireless local area network 300 before entering the delimited area 100. The wireless local area network 300 could be a part of a mobile telecommunication network, for example corresponding to one or more cells of the mobile telecommunication network.
  • The vehicles Vi, with i = 1, 2, 3, ..., may be automated vehicles or vehicles comprising one or more ADAS (Advanced Driving Assistance System) systems. Each vehicle Vi has one or more onboard sensors or sensing systems, such as radar(s), LiDAR(s), and/or camera(s), for collecting sensing data related to its own vehicle environment. Furthermore, each vehicle Vi comprises a radio communication module for connecting to the wireless local area network 300 and communicating through it. The radio communication module may include a radio transmitter and a radio receiver. It can comprise hardware means and/or software means. The radio communication module allows the vehicle Vi to connect to the wireless local area network 300, transmit the sensing data collected by the sensors of the vehicle Vi to the central data processing system 200, and receive from the central data processing system 200 an information for vehicle driving assistance. Each vehicle may also include a registration module for registering with the central data processing system 200 upon detection of and/or connection to the wireless local area network 300.
  • FIG. 2A illustrates the computer-implemented method for vehicle driving assistance within the delimited area 100, according to an embodiment.
  • At a first point in time, referenced as t0, a plurality of vehicles Vi with i = 1, 2, 3, ..., are located within the delimited area 100 and connected to the wireless local area network 300. The vehicles Vi can be started vehicles or stopped vehicles, for example parked vehicles or temporarily stopped vehicles. In any case, the vehicles Vi have one or more active sensors operable to capture sensing data and a radio communication module operable to transmit the sensing data captured by the sensors. In an embodiment, each of the vehicles Vi may have previously registered with the central data processing system 200 through the wireless local area network 300, for example when or shortly before entering the delimited area 100. A registration step S0 may be executed by each vehicle Vi entering the delimited area 100, for example upon detection and connection to the wireless local area network 300. The central data processing system 200 may store information on the registered vehicles Vi with i = 1, 2, 3, ... in a database and manage the database. The vehicles Vi may be deleted from the database for example when they are no longer connected to the wireless local area network 300.
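  • As a purely illustrative sketch (not part of the claimed method), the registration bookkeeping of step S0 and the vehicle database described above might be organized as follows; the class, its method names, and the in-memory storage are all hypothetical:

```python
class VehicleRegistry:
    """In-memory stand-in for the vehicle database of the central system."""

    def __init__(self):
        self._vehicles = {}  # vehicle ID -> vehicle metadata

    def register(self, vehicle_id, metadata=None):
        # Step S0: a vehicle registers upon detecting and connecting
        # to the wireless local area network.
        self._vehicles[vehicle_id] = metadata or {}

    def deregister(self, vehicle_id):
        # Vehicles are deleted when no longer connected to the network.
        self._vehicles.pop(vehicle_id, None)

    def registered_ids(self):
        return sorted(self._vehicles)


registry = VehicleRegistry()
registry.register("V1", {"sensors": ["radar", "camera"]})
registry.register("V2", {"sensors": ["lidar"]})
registry.deregister("V1")  # V1 left the covering area
```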
  • The sensing data collected, or captured, by the sensors in each vehicle Vi is continuously transmitted from the vehicle Vi to the central data processing system 200 in real time, through the wireless local area network 300, in a step S1. The transmission of the data collected by the onboard sensors of each vehicle Vi may be performed automatically, without any action from a user. The sensing data may be transferred to the central data processing system in real time, as soon as it is collected. The sensing data may be raw, unprocessed data from the sensor(s). Thus, the central data processing system 200 receives, from the plurality of vehicles Vi located within the delimited area 100 and connected to the wireless local area network 300, the sensing data related to the environment around each vehicle Vi. Each vehicle Vi perceives its surrounding environment with its own onboard sensors. The output of the sensors, termed the sensing data, may be transferred to the central data processing system 200, through the wireless local area network 300, by an onboard radio transmitter of the vehicle Vi. For example, the sensing data may include point clouds for radars and LiDARs, and picture frames for cameras. The sensing systems of the vehicles Vi, which may include cameras, radars, and LiDARs, do not need additional means for processing the sensing data. They can be basic, low-cost sensing systems having only the functions of sensing data and transferring the data.
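  • A minimal, hypothetical shape for one uploaded sensing message of step S1 (raw payload plus identifiers; the field names are assumptions for illustration only) could be:

```python
from dataclasses import dataclass, field
import time


@dataclass
class SensingMessage:
    """One raw, unprocessed sensor output uploaded by a vehicle in step S1."""
    vehicle_id: str
    sensor_type: str   # e.g. "radar", "lidar", or "camera"
    payload: list      # point-cloud points, or an encoded picture frame
    timestamp: float = field(default_factory=time.time)


# A radar point cloud from vehicle V3, ready for upload over the network.
msg = SensingMessage(vehicle_id="V3", sensor_type="radar",
                     payload=[(1.2, 0.4), (3.5, -0.8)])
```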
  • In a step S2, the central data processing system 200 receives the sensing data from each vehicle Vi located within the delimited area 100, through the wireless network 300. The sensing data from the vehicles Vi may be received in real time, as soon as it is collected.
  • In a step S3, the central data processing system 200 centrally processes and fuses, or aggregates, the sensing data received from the plurality of vehicles Vi to determine a fused, or aggregated, environmental information of the environment within the delimited or sensing area 100. Thus, the sensor information from all the vehicles Vi located within the delimited area 100 is fused or aggregated by the central processing system 200. It is as if the central processing system 200 was equipped with the onboard sensors of all the vehicles Vi located in the delimited area 100. The fused or aggregated environmental information may be shared with all the vehicles Vi within the delimited area 100 through the wireless network 300. In this way, each vehicle Vi is virtually equipped with the sensors of all the vehicles Vi located in the delimited area 100.
  • FIG. 2B illustrates the step S3, according to an embodiment.
  • In an embodiment, the step S3 of centrally processing and fusing, or aggregating, the sensing data from the plurality of vehicles Vi may include a step S30 of generating a fused, or aggregated, map of the delimited area 100 based on the sensing data received from the plurality of vehicles Vi.
  • The central data processing system 200 may have access to a predefined map of the delimited area 100. This predefined map may be stored in a memory of the central processing system 200, and/or obtained from an online database of a service provider like OpenStreetMap® or Google Maps®. In the step S30, the central data processing system 200 may process and analyze the sensor data received from all the vehicles Vi to obtain additional information related to the environment within the delimited area 100, and update the predefined map by using this additional information. The update of the predefined map may allow to add or delete objects or participants in the predefined map, and/or add information that may be useful for vehicle driving assistance in the delimited area 100. For example, one or more pedestrians, an object that has fallen to the ground, a traffic jam, a hazardous event such as a stopped vehicle or a collision, etc. may be added to the predefined map.
  • In a variant, in the step S30, the central processing system 200 may generate an individual map for each vehicle Vi, representing an area around the vehicle Vi, by using the sensor data from said vehicle Vi, and then fuse, or aggregate, the plurality of individual maps generated for the plurality of vehicles Vi to generate a fused map of the delimited area 100.
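  • One way such a fusion could work, purely as an illustration, is a cellwise merge of per-vehicle occupancy grids; the grid representation and the "keep the maximum" rule are assumptions, not specified by the disclosure:

```python
def fuse_maps(individual_maps, width, height):
    """Fuse per-vehicle occupancy grids into one grid of the delimited area.

    Each individual grid covers only what its vehicle perceives; unobserved
    cells are None. A fused cell keeps the highest reported occupancy value,
    so an obstacle seen by any one vehicle appears in the shared map.
    """
    fused = [[0.0] * width for _ in range(height)]
    for grid in individual_maps:
        for y in range(height):
            for x in range(width):
                value = grid[y][x]
                if value is not None and value > fused[y][x]:
                    fused[y][x] = value
    return fused


# Two 2x3 grids: V1 sees an obstacle at row 0, col 2; V2 at row 1, col 0.
v1 = [[0.0, None, 0.9], [None, None, None]]
v2 = [[None, 0.1, None], [0.8, 0.0, None]]
fused = fuse_maps([v1, v2], width=3, height=2)
```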
  • In an embodiment, by processing and fusing the sensing data from the plurality of vehicles Vi, the central data processing system 200 may determine, or identify, a hazardous situation in the delimited area 100, in a step S31. For example, an accident may have occurred in the delimited area 100, a vehicle may be stopped on a road lane in a tunnel, or an object may have fallen to the ground within the delimited area 100. The hazardous situation may be identified by processing and analyzing the sensing data received from the vehicles Vi. For example, stationary traffic can be detected by analyzing the speed data from the vehicles Vi, an object on the ground or pedestrians can be detected by image analysis of images captured by sensors of the vehicles Vi, a collision may be detected by analyzing images captured by vehicles, etc. In that case, the central data processing system 200 generates a warning message or a warning information to inform the vehicles Vi in the delimited area 100, in a step S32.
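  • For instance, the stationary-traffic check mentioned above could be sketched as a simple threshold rule over reported speeds; the function name and both thresholds are illustrative assumptions:

```python
def detect_stationary_traffic(speeds_mps, stop_threshold=0.5, min_fraction=0.8):
    """Flag a likely traffic jam when most reporting vehicles are near-stopped.

    speeds_mps: latest speed (in m/s) reported by each vehicle in the area.
    """
    if not speeds_mps:
        return False
    stopped = sum(1 for s in speeds_mps if s < stop_threshold)
    return stopped / len(speeds_mps) >= min_fraction


# Three of four vehicles stopped: 0.75 < 0.8, so no warning yet.
flowing = detect_stationary_traffic([0.0, 0.1, 0.2, 12.0])
# All four near-stopped: a warning message would be generated (step S32).
jam = detect_stationary_traffic([0.0, 0.1, 0.2, 0.3])
```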
  • In an embodiment, in a step S33, the central data processing system 200 may generate commands to control a driving or parking function of one or more target vehicles Vi in the delimited area 100, based on a result of centrally processing and fusing the sensor data of all the vehicles Vi. For example, the commands may control a target vehicle to adapt its speed in a tunnel, or take over the driving of a target vehicle to a chosen parking lot in a car park. In case of a warehouse comprising automated vehicles such as forklifts, the commands may control the movements of the automated vehicles in the warehouse.
  • In a step S4, the central data processing system 200 transmits a fused information for vehicle driving assistance, based on a result of the step S3 of centrally processing and fusing the sensing data from the vehicles Vi, to one or more target vehicles Vt of the vehicles Vi with i = 1, 2, 3, ... connected to the wireless local area network 300. The target vehicles may include one or more vehicle(s) located within the delimited area 100, and/or one or more vehicle(s) located outside the delimited area 100 but within the covering area 301 of the wireless local area network 300. For example, one or more target vehicles could be vehicles located outside the delimited area 100 and moving towards an access to the delimited area 100 to enter it.
  • The step S4 may include the following actions performed by the central data processing system 200:
    • in a step S40, transmitting to the target vehicle(s) the fused map generated in the step S30 by using the sensing data received from the plurality of vehicles Vi; and/or
    • in a step S41, transmitting to the target vehicle(s) the warning message related to a hazardous situation generated in the step S32; and/or
    • in a step S42, transmitting to the target vehicle(s) one or more commands generated in the step S33 to control a driving of the target vehicle(s).
  • The central data processing system 200 continuously receives the sensing data collected by the sensors of all the vehicles Vi with i = 1, 2, 3, ... located within the delimited area 100, processes and fuses the received sensing data to update the information for vehicle driving assistance transmitted to the target vehicles.
  • Furthermore, when a new vehicle enters the delimited area 100 and registers with the central data processing system 200, the central data processing system 200 may transmit a current information for vehicle driving assistance, for example the current fused map, and/or a current warning message, and/or commands for driving or parking, to the newly registered vehicle.
  • The information for vehicle driving assistance is continuously updated, in real time, based on the sensing data received over time by the central data processing system 200.
  • Optionally, in a step S5, the central data processing system 200 executes, for each vehicle Vi with i = 1, 2, 3, ... located within the delimited area 100, one or more tasks of driving assistance or autonomous driving, including self-localization of the vehicle Vi, mapping around the vehicle Vi, tracking external objects around the vehicle Vi, path planning for the vehicle Vi, and controlling the vehicle Vi, for example for driving and/or parking, by processing the sensing data collected by the sensors of the vehicle Vi and transferred to the central data processing system 200 through the wireless network 300. The fused environmental information resulting from the step of centrally processing and fusing the sensing data from all vehicles Vi may also be used by the central data processing system 200 to perform the tasks of driving assistance or autonomous driving for the considered vehicle Vi. The result of each task of driving assistance or autonomous driving is transferred from the central data processing system 200 to the vehicle Vi through the wireless network 300, in a step S6. In this way, the processing of the sensing data collected by each vehicle Vi with i = 1, 2, 3, ... located within the delimited area 100, to execute tasks of driving assistance and/or autonomous driving, is offloaded to the central data processing system 200. The low latency of the network 300 makes it possible to offload this processing while still allowing each vehicle Vi to use the result in real time, or almost in real time.
  • For the task of self-localization of the vehicle Vi, the central data processing system 200 may match landmark information, detected in the environment around said vehicle Vi based on the sensing data received from said vehicle Vi, against a predefined map of the delimited area 100 including the landmarks. For example, the delimited area 100 may comprise reflecting elements at predetermined positions, and the predefined map may include the reflecting elements. As a result, the central data processing system can match the landmark information obtained by processing the collected sensing data and the landmarks included in the predefined map to precisely localize the vehicle.
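  • Under the simplifying assumptions of a known vehicle heading and known landmark correspondences (neither is fixed by the disclosure), the matching step could reduce to averaging the offsets between the map landmarks and their relative observations:

```python
def localize_by_landmarks(detected_rel, map_positions):
    """Estimate the vehicle position in the map frame.

    detected_rel: landmark positions relative to the vehicle, from its sensors.
    map_positions: the same landmarks' known positions in the predefined map.
    With the heading known, each pair implies vehicle = map - relative;
    averaging over the pairs smooths out sensor noise.
    """
    n = len(detected_rel)
    x = sum(mx - dx for (dx, _), (mx, _) in zip(detected_rel, map_positions)) / n
    y = sum(my - dy for (_, dy), (_, my) in zip(detected_rel, map_positions)) / n
    return (x, y)


# Two reflecting elements seen at (2, 0) and (0, 3) relative to the vehicle,
# known to sit at (12, 5) and (10, 8) in the map: the vehicle is at (10, 5).
position = localize_by_landmarks([(2, 0), (0, 3)], [(12, 5), (10, 8)])
```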
  • The central data processing system 200 includes means for carrying out the steps of the method previously described. It is configured to receive the sensing data from the vehicles Vi, with i = 1, 2, 3, ..., located within the delimited area 100, through the wireless local area network 300, to centrally process and fuse the received sensing data, and to transmit an information for vehicle driving assistance, based on a result of the processing and fusion of the received sensing data, to target vehicles.
  • In an example embodiment illustrated in FIG. 3 , the central data processing system 200 may include a network interface 210 for connecting to the wireless local area network 300, a reception module 220, a data processing module 230 and a transmission module 240.
  • The reception module 220 is configured to receive from each vehicle Vi, with i = 1, 2, 3, ..., sensing data related to an environment of the vehicle Vi, through the network 300.
  • The data processing module 230 is responsible for centrally processing and fusing or aggregating the sensing data from the plurality of vehicles Vi, with i = 1, 2, 3, ..., and for generating a fused information for vehicle driving assistance based on a result of the processing and fusion of the sensing data. Optionally, the data processing module 230 may be configured to execute, for each vehicle Vi with i = 1, 2, 3, ... located within the delimited area 100, tasks of driving assistance and/or autonomous driving such as self-localization of the vehicle Vi, mapping around the vehicle Vi, tracking external objects around the vehicle Vi, and controlling the vehicle Vi, by processing the sensing data collected by the sensors of the vehicle Vi.
  • The transmission module 240 is configured to transmit the information for vehicle driving assistance generated based on processing and fusing the sensing data from the plurality of vehicles Vi, with i = 1, 2, 3, ..., to target vehicles in the delimited area 100, through the wireless network 300. The transmitted information may include the fused environmental information, and/or an information generated based on the fused environment information, for example one or more commands for controlling the target vehicle Vt to drive autonomously, or a warning message.
  • Optionally, the transmission module 240 is configured to transfer to a vehicle Vi the result of one or more tasks of driving assistance and/or autonomous driving executed by processing the sensing data from said vehicle Vi, as previously explained.
  • The central data processing system 200 may further include a vehicle database 250 for storing information on each vehicle located within the delimited area 100 that has registered with the central data processing system 200, a registration module 260 configured to perform the registration, in the database 250, of vehicles located within the delimited area 100, and a database management module 270 responsible for storage, retrieval, and update of information in the database 250.
  • Aspects include a central data processing system for vehicle driving assistance within a delimited area, including means for carrying out the steps of a method described herein. Aspects also include a system including a central data processing system for vehicle driving assistance within a delimited area covered by a wireless local area network, and a plurality of vehicles (Vi, i=1,2,3...) located within the delimited area, each vehicle having one or more sensors for collecting sensing data related to an environment around said vehicle, and a communication module for transmitting the collected sensing data to the central data processing system through the wireless local area network. Aspects further include a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of a method described herein.
  • The use of “example,” “advantageous,” and grammatically related terms means “serving as an example, instance, or illustration,” and not “preferred” or “advantageous over other examples.” Items represented in the accompanying figures and terms discussed herein may be indicative of one or more items or terms, and thus reference may be made interchangeably to single or plural forms of the items and terms in this written description. The use herein of the word “or” may be considered use of an “inclusive or,” or a term that permits inclusion or application of one or more items that are linked by the word “or” (e.g., a phrase “A or B” may be interpreted as permitting just “A,” as permitting just “B,” or as permitting both “A” and “B”), unless the context clearly dictates otherwise. Also, as used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. For instance, “at least one of a, b, or c” can cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, c-c-c, or any other ordering of a, b, and c).

Claims (17)

What is claimed is:
1. A computer-implemented method for vehicle driving assistance within a delimited area, the method comprising:
receiving, from a plurality of vehicles located within the delimited area, through a wireless local area network covering said delimited area, sensing data from onboard sensors of the plurality of vehicles, wherein the sensing data from each vehicle is related to an environment around the vehicle;
centrally processing and fusing the sensing data from the plurality of vehicles; and
transmitting, to at least one target vehicle, through the wireless local area network, an information for vehicle driving assistance based on a result of the centrally processing and fusing.
2. The computer-implemented method according to claim 1, wherein centrally processing and fusing further comprises:
generating a fused map of the delimited area based on the sensing data received from the plurality of vehicles.
3. The computer-implemented method according to claim 2, further comprising:
transmitting the generated fused map.
4. The computer-implemented method according to claim 2, wherein the fused map is generated by updating a predefined map with the sensing data received from the plurality of vehicles.
5. The computer-implemented method according to claim 2, wherein the fused map is generated by fusion of a plurality of individual maps, wherein each individual map includes an area around the vehicle that corresponds to a part of the delimited area.
6. The computer-implemented method according to claim 5, wherein each individual map is generated based on the sensing data received from the vehicle.
7. The computer-implemented method according to claim 1, further comprising:
registering a vehicle with a central data processing system, through the wireless local area network, upon detection of the wireless local area network by the vehicle.
8. The computer-implemented method according to claim 1, wherein transmitting the information for vehicle driving assistance further comprises:
transmitting one or more commands to control at least one of a driving function or a parking function of the at least one target vehicle.
9. The computer-implemented method according to claim 1, wherein transmitting the information for vehicle driving assistance further comprises:
transmitting a warning message related to a hazardous situation determined by the centrally processing and fusing the sensing data from the plurality of vehicles.
10. The computer-implemented method according to claim 1, wherein a central data processing system executes, for each of the plurality of vehicles, at least one of an operation including:
self-localization of the vehicle,
mapping around the vehicle,
tracking external objects around the vehicle,
path planning for the vehicle, or
controlling the vehicle, based on the sensing data from the vehicle.
11. The computer-implemented method according to claim 10, wherein the operation of self-localization of the vehicle comprises:
matching a landmark information detected in the environment around the vehicle based on the sensing data received from the vehicle, and a map of the delimited area.
12. The computer-implemented method according to claim 1, wherein the wireless local area network is a network based on at least one of a 5G communication system or a next generation communication system.
13. The computer-implemented method according to claim 1, wherein the delimited area is one of an environment including:
a tunnel,
a parking garage,
a warehouse, or
a bridge.
14. The computer-implemented method according to claim 1, wherein transmitting the information for vehicle driving assistance comprises transmitting to the at least one target vehicle located within the delimited area.
15. The computer-implemented method according to claim 1, wherein transmitting the information for vehicle driving assistance comprises transmitting to the at least one target vehicle located outside the delimited area but within a covering area of the wireless local area network.
16. A central data processing system for vehicle driving assistance within a delimited area, the central data processing system comprising:
a computer program comprising instructions which, when the program is executed by a computer, cause the computer to:
receive, from a plurality of vehicles located within the delimited area, through a wireless local area network covering said delimited area, sensing data from onboard sensors of the plurality of vehicles, wherein the sensing data from each vehicle is related to an environment around the vehicle;
process and fuse the sensing data from the plurality of vehicles; and
transmit, to at least one target vehicle, through the wireless local area network, an information for vehicle driving assistance.
17. A system comprising:
a wireless local area network;
a plurality of vehicles located within a delimited area, each vehicle comprising:
one or more sensors configured to collect sensing data related to an environment around the vehicle; and
a communication module configured to transmit the collected sensing data through the wireless local area network; and
a central data processing system for vehicle driving assistance within the delimited area covered by the wireless local area network, the central data processing system comprising:
a computer program comprising instructions which, when the program is executed by a computer, cause the computer to:
receive, from the plurality of vehicles located within the delimited area the collected sensing data;
process and fuse the collected sensing data from the plurality of vehicles; and
transmit, to at least one target vehicle, through the wireless local area network, an information for vehicle driving assistance.
US18/160,553 2022-01-28 2023-01-27 Method for Vehicle Driving Assistance within Delimited Area Pending US20230242099A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22153966.1 2022-01-28
EP22153966.1A EP4220580A1 (en) 2022-01-28 2022-01-28 Method for vehicle driving assistance within delimited area

Publications (1)

Publication Number Publication Date
US20230242099A1 true US20230242099A1 (en) 2023-08-03

Family

ID=80122534

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/160,553 Pending US20230242099A1 (en) 2022-01-28 2023-01-27 Method for Vehicle Driving Assistance within Delimited Area

Country Status (3)

Country Link
US (1) US20230242099A1 (en)
EP (1) EP4220580A1 (en)
CN (1) CN116528154A (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021042051A1 (en) * 2019-08-31 2021-03-04 Nvidia Corporation Map creation and localization for autonomous driving applications
KR20190121275A (en) 2019-10-07 2019-10-25 엘지전자 주식회사 System, apparatus and method for indoor positioning
US11288520B2 (en) * 2020-06-25 2022-03-29 Toyota Motor Engineering & Manufacturing N.A. Inc. Systems and methods to aggregate and distribute dynamic information of crowdsourcing vehicles for edge-assisted live map service

Also Published As

Publication number Publication date
CN116528154A (en) 2023-08-01
EP4220580A1 (en) 2023-08-02

Similar Documents

Publication Publication Date Title
US10471955B2 (en) Stop sign and traffic light alert
US9971352B1 (en) Automated co-pilot control for autonomous vehicles
US10349011B2 (en) System and method for improved obstacle awareness in using a V2X communications system
US10328934B2 (en) Temporal data associations for operating autonomous vehicles
US10073456B2 (en) Automated co-pilot control for autonomous vehicles
US10613547B2 (en) System and method for improved obstacle awareness in using a V2X communications system
US10268203B2 (en) Calibration validation for autonomous vehicle operations
US11915440B2 (en) Generation of structured map data from vehicle sensors and camera arrays
CN110945320B (en) Vehicle positioning method and system
US20180341263A1 (en) Methods and systems for moving object velocity determination
US20190051168A1 (en) System and Method for Improved Obstable Awareness in Using a V2x Communications System
US11294387B2 (en) Systems and methods for training a vehicle to autonomously drive a route
US10782384B2 (en) Localization methods and systems for autonomous systems
US11840262B2 (en) Production factory unmanned transfer system and method
CN111497853B (en) System and method for sensor diagnostics
CN114503176B (en) Method for acquiring self position and electronic device
US20230242099A1 (en) Method for Vehicle Driving Assistance within Delimited Area
US11661077B2 (en) Method and system for on-demand roadside AI service
US20230194301A1 (en) High fidelity anchor points for real-time mapping with mobile devices
RU2772620C1 (en) Creation of structured map data with vehicle sensors and camera arrays
US20240219199A1 (en) Non-semantic map layer in crowdsourced maps
EP4357944A1 (en) Identification of unknown traffic objects
US20240135252A1 (en) Lane-assignment for traffic objects on a road
WO2024144948A1 (en) Non-semantic map layer in crowdsourced maps

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VOLLBRACHT, DENNIS;REEL/FRAME:063655/0127

Effective date: 20221205

AS Assignment

Owner name: APTIV TECHNOLOGIES (2) S.A R.L., LUXEMBOURG

Free format text: ENTITY CONVERSION;ASSIGNOR:APTIV TECHNOLOGIES LIMITED;REEL/FRAME:066746/0001

Effective date: 20230818

Owner name: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L., LUXEMBOURG

Free format text: MERGER;ASSIGNOR:APTIV TECHNOLOGIES (2) S.A R.L.;REEL/FRAME:066566/0173

Effective date: 20231005

Owner name: APTIV TECHNOLOGIES AG, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L.;REEL/FRAME:066551/0219

Effective date: 20231006