WO2022027159A1 - Systems and methods for constructing high-definition map with its confidence determined based on crowdsourcing


Info

Publication number
WO2022027159A1
Authority
WO
WIPO (PCT)
Prior art keywords
confidence
map
sensor data
sensor
aggregated
Prior art date
Application number
PCT/CN2020/106502
Other languages
French (fr)
Inventor
Xiaozhi Qu
Teng MA
Original Assignee
Beijing Voyager Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Voyager Technology Co., Ltd. filed Critical Beijing Voyager Technology Co., Ltd.
Priority to PCT/CN2020/106502 priority Critical patent/WO2022027159A1/en
Publication of WO2022027159A1 publication Critical patent/WO2022027159A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3815 Road data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3841 Data obtained from two or more sources, e.g. probe vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3848 Data obtained from both position sensors and additional sensors

Definitions

  • the present disclosure relates to systems and methods for constructing High-definition (HD) maps, and more particularly to, systems and methods for constructing an HD map with its confidence determined based on crowdsourcing.
  • a typical solution for the above problems is to equip an autonomous driving vehicle with combined sensors, such as one or more of a Light Detection and Ranging (LiDAR) scanner, a high-resolution camera, a Global Positioning System (GPS) receiver, or an Inertial Measurement Unit (IMU) .
  • HD maps can be generated by using GPS/IMU combined sensors to provide absolute pose and position information.
  • a more accurate solution is to further combine the GPS/IMU combined sensor with LiDAR and high-resolution camera using Simultaneous Localization and Mapping (SLAM) method.
  • accuracy of an HD map is usually measured by a “confidence” of the positions of the map elements. Confidence is crucial in positioning an autonomous driving vehicle. Because there is no viable means to determine the true confidence of the HD map, it is typically determined based on the confidence of the sensor signal. For example, the confidence of a GPS signal is represented using the Dilution of Precision (DOP) of the signal. However, because the DOP value of a signal depends on the distribution of the observing satellites, the DOP value of the same location observed at different times can be different.
  • Embodiments of the disclosure address the above problems by methods and systems that provide an improved solution for calculating a confidence of the HD map based on crowdsourcing.
  • Embodiments of the disclosure disclose a method for constructing an HD map based on a confidence.
  • the method may include receiving, by a communication interface, a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor.
  • the method may further include determining, by at least one processor, a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data.
  • the method may also include determining, by the at least one processor, an aggregated position of the map element by weighting the first position and the second position.
  • the method may also include determining, by the at least one processor, an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence.
  • the method may also include constructing, by the at least one processor, the HD map based on the aggregated position and the aggregated confidence.
  • Embodiments of the disclosure further provide a system for constructing an HD map based on a confidence.
  • the system may include a communication interface configured to communicate with a plurality of terminals via a network.
  • the system may further include a storage configured to store the HD map.
  • the system may also include at least one processor.
  • the at least one processor may be configured to determine a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data.
  • the at least one processor may be further configured to determine an aggregated position of the map element by weighting the first position and the second position.
  • the at least one processor may also be configured to determine an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence.
  • the at least one processor may also be configured to construct the HD map based on the aggregated position and the aggregated confidence.
  • Embodiments of the disclosure further disclose a non-transitory computer-readable medium having a computer program stored thereon.
  • the computer program when executed by at least one processor, may perform a method for constructing an HD map based on a confidence.
  • the method may include receiving a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor.
  • the method may further include determining a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data.
  • the method may also include determining an aggregated position of the map element by weighting the first position and the second position.
  • the method may also include determining an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence.
  • the method may also include constructing the HD map based on the aggregated position and the aggregated confidence.
  • FIG. 1 illustrates a schematic diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
  • FIG. 2 illustrates a block diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
  • FIG. 3 is a flowchart of an exemplary method for constructing an HD map, according to embodiments of the disclosure.
  • FIG. 1 illustrates a schematic diagram of an exemplary system 100 for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
  • system 100 may include a server 140 communicatively connected with a plurality of terminals, including terminals 131 and 132.
  • server 140 may be a local physical server, a cloud server (as illustrated in FIG. 1) , a virtual server, a distributed server, or any other suitable computing device.
  • server 140 may store an HD map.
  • server 140 may be responsible for calculating a confidence of an HD map based on crowdsourcing and constructing the HD map based on the confidence. Instead of constructing the HD map based on sensor data acquired from one survey vehicle, server 140 may crowdsource data captured of a target region by multiple sensors at varying view positions and integrate such data to construct and update the HD map. For example, server 140 may crowdsource data from terminals 131 and 132. Server 140 may communicate with terminals 131 and 132 via a network, such as a Wireless Local Area Network (WLAN) , a Wide Area Network (WAN) , wireless networks such as radio waves, a nationwide cellular network, a satellite communication network, and/or a local wireless network (e.g., Bluetooth™ or WiFi) . Server 140 may receive data from terminals 131 and 132. It is contemplated that server 140 may crowdsource from more terminals than those illustrated in FIG. 1.
  • terminals 131 and 132 may be mobile sensing systems configured to move around a target region to acquire sensor data of the target region.
  • terminals 131 and 132 may each be or include a LiDAR scanner, a high-resolution camera, a GPS receiver, an IMU, or other cost-effective sensing device.
  • terminals 131 and 132 may be installed, mounted, or otherwise attached to a vehicle, such that the terminals may be carried around by the vehicle.
  • the vehicle may be configured to be operated by an operator occupying the vehicle, remotely controlled, and/or autonomous.
  • a Simultaneous Localization and Mapping (SLAM) method may be performed by server 140 to combine data from different sensing devices.
  • GPS/IMU signals and point cloud data may provide additional information to the SLAM algorithm, thus enhancing the positioning accuracy and reliability when the map is used by an autonomous vehicle to position itself.
  • each of terminals 131 and 132 may include a combination of different sensing devices for acquiring sensor data of a target region 110.
  • the combination may include a LiDAR, a high-resolution camera, a GPS, an IMU or other cost-effective sensing devices.
  • Sensor data acquired by different sensing devices may be combined using a SLAM algorithm by server 140.
  • terminal 131 may detect map elements (e.g., lane markings 111 and 112, and a zebra line 113) within target region 110 as illustrated in FIG. 1.
  • Terminal 131 may determine the confidence of sensor data acquired by its sensors. The sensor data and the confidence of the sensor data may be provided to server 140.
  • server 140 may determine the confidence of map elements based on the confidence of the sensor data and an error propagation rule. Server 140 may also determine an initial position of target region 110 based on the sensor data. For example, server 140 may use the sensor data to construct an initial HD map of target region 110 based on optimizing pose information of a plurality of map elements (e.g., optimizing pose information of lane markings 111 and 112, and zebra line 113) within the sensor data. Server 140 may further initiate an HD map construction process.
  • server 140 may match the initial HD map with the sensor data of target region 110 acquired by other terminals, such as terminal 132. For example, server 140 may identify sensor data acquired from terminal 132 as also capturing at least a part of target region 110 and match the initial map of target region 110 with the identified sensor data. Based on the matched additional sensor data, server 140 may further update the initial HD map to construct an updated HD map.
  • Because terminals 131 and 132 may acquire sensor data from different angles and/or at different distances relative to target region 110, they may acquire data of a same scene from different view positions. Accordingly, the different characteristics of the sensors as well as the varying view positions the sensors use to capture the sensor data may result in different estimated positions of each map element. These estimated positions are also associated with different confidences, as determined based on the different sets of sensor data. For example, for a map element captured by sensor data acquired from both terminals 131 and 132, two estimated positions may be generated for the element, each associated with a different confidence. In some embodiments, the different estimated positions and confidences may be aggregated to become the position and confidence of map elements of the constructed HD map. As the confidence of map elements in the HD map is determined based on aggregating different observations from multiple sensing devices, the constructed HD map may be more reliable when used in an autonomous driving vehicle's positioning.
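The two-observation aggregation described above can be illustrated with a small numerical sketch. The disclosure does not fix a weighting scheme beyond pre-allocation by sensor accuracy, so the inverse-variance weights below are an assumption made for illustration:

```python
import numpy as np

# Two hypothetical observations of the same map element (e.g., a corner of
# lane marking 111), each with a standard deviation expressing confidence.
p1, sigma1 = np.array([10.0, 5.0]), 0.2   # from terminal 131
p2, sigma2 = np.array([10.4, 5.2]), 0.4   # from terminal 132

# Inverse-variance weights (an assumed scheme); they sum to one.
w1 = (1 / sigma1**2) / (1 / sigma1**2 + 1 / sigma2**2)
w2 = 1.0 - w1

# Aggregated position: weighted sum of the two estimates.
p_agg = w1 * p1 + w2 * p2

# Aggregated confidence via linear error propagation.
sigma_agg = np.sqrt(w1**2 * sigma1**2 + w2**2 * sigma2**2)
```

Under these assumed weights the aggregated standard deviation (about 0.18) is smaller than either input, reflecting that combining independent observations tightens the estimate.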
  • Although FIG. 1 illustrates target region 110 as containing exemplary map elements such as lane markings 111 and 112, and a zebra line 113, it is contemplated that the disclosed systems and methods may apply to construct and update an HD map that includes other map elements.
  • map elements may also include traffic signs, buildings or trees, etc. within target region 110.
  • FIG. 2 illustrates a block diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
  • server 140 may collect sensor data through terminals 131 and 132, integrate the data from multiple sources, calculate confidences of map elements within an HD map, and construct the HD map based on an aggregated confidence.
  • server 140 may include a communication interface 202, a processor 204, a memory 212, and a storage 214.
  • server 140 may have different modules in a single device, such as a standalone computing device, or separated devices with dedicated functions.
  • one or more components of server 140 may be located in a cloud or may be alternatively in a single location or distributed locations. Components of server 140 may be in an integrated device or distributed at different locations but communicate with each other through a network (not shown) .
  • Server 140 may communicate with terminals 131 and 132 through a network. Although omitted from FIG. 2, it is contemplated that each of terminals 131 and 132 may also include hardware components similar to those shown in server 140, such as a processor 230, a communication interface (not shown) , a memory (not shown) , and a storage (not shown) .
  • Communication interface 202 may send data to and receive data from terminals 131 and 132 or other systems or devices the terminals are attached to (e.g., a vehicle) via communication cables, a Wireless Local Area Network (WLAN) , a Wide Area Network (WAN) , wireless networks such as radio waves, a nationwide cellular network, and/or a local wireless network (e.g., Bluetooth™ or WiFi) , or other communication methods.
  • communication interface 202 can be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection.
  • communication interface 202 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
  • Wireless links can also be implemented by communication interface 202.
  • communication interface 202 can send and receive electrical, electromagnetic or optical signals that carry digital data streams representing various types of information via a network.
  • communication interface 202 may receive sensor data captured by terminals 131 and 132 and provide the received sensor data to storage 214 for storage or to processor 204 for processing.
  • Processor 204 and processor 230 may include any appropriate type of general-purpose or special-purpose microprocessor, digital signal processor, or microcontroller.
  • processor 230 may be configured to acquire sensor data for constructing an HD map and calculating confidences of the captured sensor data.
  • Processor 204 may be configured as a separate processor module dedicated to combining the sensor data acquired by multiple terminals, calculating an aggregated confidence of the HD map, and constructing an HD map based thereon.
  • processor 204 and processor 230 may be configured as a shared processor module for performing other functions unrelated to HD map construction.
  • terminal 131 may include a sensor data acquisition unit 231 and processor 230.
  • Processor 230 may include a sensor data confidence unit 232, and the like. These modules (and any corresponding sub-modules or sub-units) can be hardware units (e.g., portions of an integrated circuit) designed for use with other components or to execute a part of a program.
  • the program may be stored on a computer-readable medium, and when executed by processor 230, it may perform one or more functions.
  • Although FIG. 2 shows only unit 232 within processor 230, it is contemplated that similar units (e.g., sensor data acquisition unit 231) may also be within processor 230 or, alternatively, may be distributed among multiple processors located near or remote from each other.
  • Sensor data acquisition unit 231 may be configured to capture sensor data of target region 110.
  • Sensor data acquisition unit 231 may be coupled with sensors used in a navigation unit, such as a GPS receiver and one or more IMU sensors.
  • a GPS is a global navigation satellite system that provides geolocation and time information to a GPS receiver.
  • An IMU is an electronic device that measures and provides a vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using various inertial sensors, such as accelerometers and gyroscopes, sometimes also magnetometers.
  • the combined GPS receiver and IMU sensor can provide real-time pose information of terminal 131 as it travels, including the positions and orientations (e.g., Euler angles) of terminal 131 at each time point.
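The orientation part of the pose described above can be sketched by converting Euler angles to a rotation matrix. The Z-Y-X (yaw-pitch-roll) convention used below is an assumption; the disclosure only states that orientations may be expressed as Euler angles:

```python
import numpy as np

def euler_to_rotation(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Z-Y-X (yaw-pitch-roll) Euler angles to a 3x3 rotation matrix.
    The convention is an assumed choice for illustration."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# A pose pairs this orientation with a GPS-derived position.
R = euler_to_rotation(0.0, 0.0, np.pi / 2)  # 90-degree yaw
```

With a 90-degree yaw, the x-axis maps onto the y-axis, and the result is orthonormal with determinant 1, as any rotation matrix must be.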
  • Sensor data acquisition unit 231 may also be coupled with other positioning sensors, such as a high-resolution camera and a LiDAR scanner.
  • a high-resolution camera may include an image acquisition unit which may be configured to capture images of surrounding objects.
  • the image acquisition unit may include a controller controlling the setting and operation of a monocular camera.
  • the image acquisition unit may control and adjust the focus, aperture, shutter speed, white balance, metering, filters, and other settings of the camera.
  • the image acquisition unit may control and adjust the orientation and position of the camera so that the camera captures an image at a predetermined view angle and position.
  • the high-resolution camera may be set to capture images upon triggers, continuously, or periodically, and each image captured at a time point is called a frame.
  • a LiDAR scanner may capture point cloud data of surrounding objects. LiDAR measures distance to a target by illuminating the target with pulsed laser light and measuring the reflected pulses with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target.
  • the light used for LiDAR scan may be ultraviolet, visible, or near infrared.
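The time-of-flight principle described above fits in a few lines: the pulse travels to the target and back, so the one-way distance is half the speed of light times the round-trip time:

```python
# Speed of light in m/s; LiDAR range from pulse round-trip time.
C = 299_792_458.0

def lidar_distance(round_trip_seconds: float) -> float:
    """One-way distance: the pulse travels out and back, hence the /2."""
    return C * round_trip_seconds / 2.0

# A return detected about 667 ns after emission puts the target near 100 m.
d = lidar_distance(667e-9)
```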
  • Sensor data confidence unit 232 may be configured to determine the confidence of the sensor data.
  • the confidences of the respective sensing devices (e.g., a GPS/IMU sensor, a LiDAR scanner, and/or a high-resolution camera) may be determined by sensor data confidence unit 232 based on device specifications and data acquisition settings.
  • terminal 132 may have the same or different structure as terminal 131 or may be any structure capable of providing one or more types of different sensor data (e.g., capable of providing one or more of images, GPS/IMU data and/or point cloud data) to server 140.
  • terminal 132 may capture sensor data of target region 110 subsequent to terminal 131.
  • processor 204 may include multiple modules, such as a HD map construction unit 241, a HD map confidence unit 242, and the like. These modules (and any corresponding sub-modules or sub-units) can be hardware units (e.g., portions of an integrated circuit) of processor 204 designed for use with other components or to execute a part of a program.
  • the program may be stored on a computer-readable medium, and when executed by processor 204, it may perform one or more functions.
  • Although FIG. 2 shows units 241 and 242 within one processor 204, it is contemplated that these units may be distributed among multiple processors located near or remote from each other.
  • HD map construction unit 241 may construct an initial HD map.
  • HD map construction unit 241 may be configured to combine different types of sensor data, e.g., captured by terminal 131, by performing a SLAM algorithm.
  • GPS/IMU data and the point cloud data may provide absolute position information that may be used as constraints or a priori information to improve the SLAM algorithm.
  • HD map construction unit 241 may also use a dynamic model to balance the contributions from various sensing devices. Because a SLAM algorithm applied to monocular images alone typically cannot provide accurate position information, and position errors may accumulate, consistent with the present disclosure, HD map construction unit 241 may integrate GPS, IMU, and LiDAR data into the SLAM algorithm to generate a more accurate and reliable initial HD map of target region 110.
  • HD map construction unit 241 may calculate the position of each map element within the HD map based on the initial HD map. For example, for a map element i (e.g., one of lane markings 111 and 112, or zebra line 113) , the original position observed by a first sensing device (e.g., terminal 131) may be P0.
  • the map element i may be further observed by the same or different terminals (e.g., terminal 131, terminal 132 and/or other similar functional terminals) for N more times.
  • HD map construction unit 241 may subsequently receive the second sensor data acquired by terminal 132 and confidences associated with the second sensor data from communication interface 202.
  • the second sensor data may also include sensor data acquired by multiple sensor devices.
  • HD map construction unit 241 may identify the second sensor data that captures at least part of target region 110 among multiple sensor data received subsequently.
  • HD map construction unit 241 may perform a positioning method similar to that described above with respect to the first sensor data, e.g., the SLAM method, on the second sensor data to determine a position P1 of map element i.
  • the observed positions of map element i in those observations are P1 … PN, respectively.
  • the weights of those terminals are ω1 … ωN (e.g., the weights of different terminals may be pre-allocated based on the accuracy of the sensors used by the terminals) , and the weights of the terminals are further restricted by equation (1) :
  • HD map confidence unit 242 may calculate an aggregated position of the map element i in the HD map according to equation (2) :
  • HD map confidence unit 242 may calculate the confidence of each map element within the constructed HD map corresponding to positions P1 … PN.
  • the confidence of the initial HD map may be determined based on the confidence of the sensing devices and the error propagation (e.g., to calculate a variance or standard deviation of map elements based on the sensing device confidence) associated with the HD map construction. For example, following the above example, the confidence of the position of map element i observed by the first sensing device is σ0, and the confidences of the positions observed by the following sets of sensing devices are σ1 … σN; the aggregated confidence of the position of map element i in the HD map may be determined by equation (3) :
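Equations (1) through (3) are published as images and are not reproduced in this text, so the sketch below encodes one plausible reading consistent with the surrounding description: the weights sum to one (equation (1)), the aggregated position is the weighted sum of the observed positions (equation (2)), and the aggregated confidence follows linear error propagation (equation (3)). These exact forms are assumptions, not the publication's equations:

```python
import numpy as np

def aggregate(positions, sigmas, weights):
    """Aggregate N+1 observations P0 ... PN of one map element i,
    with confidences sigma0 ... sigmaN and weights omega0 ... omegaN."""
    w = np.asarray(weights, dtype=float)
    P = np.asarray(positions, dtype=float)
    s = np.asarray(sigmas, dtype=float)
    # Equation (1), assumed form: the weights must sum to one.
    assert np.isclose(w.sum(), 1.0), "weights must sum to one"
    p_agg = w @ P                              # equation (2), assumed form
    sigma_agg = np.sqrt((w**2 * s**2).sum())   # equation (3), assumed form
    return p_agg, sigma_agg

# Three observations of map element i with pre-allocated weights.
p_i, sigma_i = aggregate(positions=[[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]],
                         sigmas=[0.3, 0.3, 0.3],
                         weights=[0.5, 0.25, 0.25])
```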
  • terminal 132 or other similar terminals may capture a data frame at each of a sequence of time points. Multiple data frames may be combined (e.g., through time/space shifting) to form the sensor data. Server 140 may calculate the aggregated position and confidence of the map element when each data frame is received, when multiple data frames are received, when all the sensor data captured by the terminal is received, or any suitable combination of the above. In other words, the sensor data may be streamed as they become available, which may enable server 140 to process the sensor data frame by frame in real time while subsequent frames are being captured. Alternatively, data may be transmitted in bulk after a section of, or the entire, survey is completed.
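The frame-by-frame processing described above can be sketched with running sums, so that earlier frames need not be stored. The inverse-variance weighting here is an assumed choice, not one specified by the disclosure:

```python
class StreamingAggregator:
    """Update the aggregated position/confidence of one (1-D) map element
    coordinate as each new observation arrives, without storing old frames.
    A sketch: running sums of unnormalized inverse-variance weights."""

    def __init__(self):
        self.w_sum = 0.0       # sum of raw weights
        self.wp_sum = 0.0      # sum of weight * position
        self.w2s2_sum = 0.0    # sum of (weight * sigma)^2

    def add(self, position: float, sigma: float) -> None:
        w = 1.0 / sigma**2     # raw (unnormalized) weight, assumed scheme
        self.w_sum += w
        self.wp_sum += w * position
        self.w2s2_sum += (w * sigma) ** 2

    def estimate(self):
        """Current aggregated position and propagated standard deviation."""
        p = self.wp_sum / self.w_sum
        sigma = (self.w2s2_sum ** 0.5) / self.w_sum
        return p, sigma
```

Each call to `add` refines the estimate, so the server can report an up-to-date position and confidence after every frame while later frames are still being captured.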
  • Memory 212 and storage 214 may include any appropriate type of mass storage provided to store any type of information that processor 204 may need to operate.
  • Memory 212 and storage 214 may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM.
  • Memory 212 and/or storage 214 may be configured to store one or more computer programs that may be executed by processor 204 to perform map update functions disclosed in this application.
  • memory 212 and/or storage 214 may be configured to store program(s) that may be executed by processor 204 to communicate with terminals 131-134 for image acquisitions, and update an HD map using the images.
  • Memory 212 and/or storage 214 may be further configured to store information and data used by processor 204.
  • memory 212 and/or storage 214 may be configured to store the HD map, including its point cloud data, and images captured by terminals 131-134, the machine learning models (e.g., the model parameters) and the feature maps, and other intermediate data created during the processing. These data may be stored permanently, removed periodically, or disregarded immediately after each frame of data is processed.
  • FIG. 3 is a flowchart of an exemplary method 300 performed by a server for constructing an HD map and calculating a confidence of the HD map, according to embodiments of the disclosure.
  • method 300 may be implemented by processor 204 of server 140. It is contemplated that method 300 may also be partially or wholly implemented by processor 230 of terminal 131. For example, steps for processing the sensor data received by terminal 131, such as S302-S306, may be performed by processor 230.
  • Method 300 may include steps S302-S316 as described below, based on an exemplary embodiment in which all the steps are performed on server 140.
  • server 140 may be configured to receive sensor data of a target region from terminal(s) (e.g., terminal 131) .
  • sensor data may include GPS and IMU information of the target region.
  • Sensor data may also include images and/or point cloud data of surrounding objects.
  • server 140 may be configured to construct an initial HD map and determine an initial position of the target region based on the sensor data.
  • server 140 may perform a SLAM algorithm to combine different types of sensor data to construct the initial HD map and determine the initial position of the target region (e.g., both GPS/IMU data and the point cloud data may be used as constraints or a priori information to the SLAM algorithm applied to images data of the target region) .
  • server 140 may further be configured to calculate the confidence of the initial map.
  • the confidence of the initial map may be determined based on the confidence of the sensing devices and the error propagation (e.g., to calculate a variance or standard deviation of map elements based on the sensing device confidence) associated with the HD map construction.
  • server 140 may be configured to receive additional sensor data from terminals other than terminal 131 or from another service trip of terminal 131.
  • server 140 may be configured to determine a second position of the target region based on the additional sensor data received in step S308 and calculate a confidence of the second position.
  • Server 140 may determine the second position of the target region using the same or a similar method as that used in step S304 for determining the initial position of the target region.
  • Server 140 may further determine the confidence of the second position using the same or a similar method as that used in step S306 for calculating the confidence of the initial map.
  • server 140 may be configured to determine an aggregated position of the target region.
  • server 140 may determine the aggregated position of the target region in the HD map based on registering the second sensor data to the initial map.
  • server 140 may establish constraints between the initial map and the second sensor data (e.g., based on odometry and/or registration of the initial map and the second sensor data) and register the second sensor data to the initial map to determine an aggregated position of the target region.
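The registration step above could, for example, use a rigid point-set alignment. The disclosure does not name an algorithm; the Kabsch/SVD solution below is a hypothetical stand-in that finds the rotation and translation mapping paired points of the second sensor data onto the initial map:

```python
import numpy as np

def register(source, target):
    """Rigid registration (Kabsch algorithm) of paired 3-D points:
    find rotation R and translation t with R @ source_i + t ~= target_i.
    An illustrative stand-in; the disclosure does not specify a method."""
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    cs, ct = src.mean(axis=0), tgt.mean(axis=0)      # centroids
    H = (src - cs).T @ (tgt - ct)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard vs. reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t
```

Given correspondences between map elements observed in both data sets, `register` returns the transform that places the second observations in the initial map's frame, after which positions can be aggregated element by element.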
  • the original position observed by a first set of sensing devices is P0
  • the map element i has been further observed by following sets of sensing devices (terminal 132 and/or other similar functional terminals) for N more times.
  • the observed positions of map element i in those observations are P1 … PN, respectively.
  • the weights of those terminals are ω1 … ωN (e.g., the weights of different terminals may be pre-allocated based on the accuracy of the sensors used by the terminals) , and the weights of the terminals are further restricted by equation (1) :
  • the aggregated position of the map element i in the HD map may be determined by equation (2) :
  • server 140 may be configured to determine an aggregated confidence of the target region. For example, following the above example, the confidence of the position of map element i observed by the first set of sensing devices (e.g., terminal 131) is σ0, and the confidences of the positions observed by the following sets of sensing devices are σ1 … σN; the aggregated confidence of the position of map element i in the HD map may be determined as shown in equation (3) :
  • server 140 may be configured to update/construct the initial HD map based on the aggregated confidence and the aggregated position. As the confidence of map elements in the HD map is determined based on aggregating different observations from multiple sensing devices, the constructed HD map may be more reliable when being used in autonomous driving vehicle’s positioning.


Abstract

Methods and systems for constructing an HD map based on a confidence. The system may include a communication interface (202) configured to communicate with a plurality of terminals (131, 132) via a network. The system may further include a storage (214) configured to store the HD map. The system may also include at least one processor (204). The at least one processor (204) may be configured to determine a first position of a map element within a target region and a first confidence associated with the first position based on first sensor data, and a second position of the map element and a second confidence associated with the second position based on second sensor data. The at least one processor (204) may be further configured to determine an aggregated position of the map element by weighting the first position and the second position. The at least one processor (204) may be further configured to determine an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence. The at least one processor (204) may be further configured to construct the HD map based on the aggregated position and the aggregated confidence.

Description

[Title established by the ISA under Rule 37.2] SYSTEMS AND METHODS FOR CONSTRUCTING HIGH-DEFINITION MAP WITH ITS CONFIDENCE DETERMINED BASED ON CROWDSOURCING TECHNICAL FIELD
The present disclosure relates to systems and methods for constructing High-definition (HD) maps, and more particularly to, systems and methods for constructing an HD map with its confidence determined based on crowdsourcing.
BACKGROUND
Autonomous driving technology relies heavily on accurate positioning. For example, positioning with the Global Positioning System (GPS) is strongly affected by surrounding objects, such as trees and buildings. These objects reflect or block GPS signals, so observations made with the same equipment from different spots can yield results of different accuracy. For example, when using GPS to position in an open area, the accuracy can reach a meter level, and when using Real Time Kinematic (RTK), an enhanced satellite-based navigation technique, the accuracy can reach a centimeter level. However, when those systems are used in urban or canyon areas, the blocking and reflection of the signal largely degrade the positioning accuracy.
A typical solution to the above problems is to equip an autonomous driving vehicle with combined sensors. For example, one or more sensors, such as a Light Detection and Ranging (LiDAR) scanner, a high-resolution camera, a Global Positioning System (GPS) receiver, or an Inertial Measurement Unit (IMU), may be combined with HD maps to provide accurate positioning results. In practice, HD maps can be generated by using GPS/IMU combined sensors to provide absolute pose and position information. A more accurate solution is to further combine the GPS/IMU combined sensor with a LiDAR and a high-resolution camera using a Simultaneous Localization and Mapping (SLAM) method. The accuracy of these systems, however, still largely depends on the accuracy of the combined sensors and of the HD maps.
In autonomous driving, the accuracy of an HD map is usually measured by a "confidence" of the positions of its map elements. Confidence is crucial in positioning an autonomous driving vehicle. Because there is no viable means to determine the true confidence of the HD map, it is typically determined based on the confidence of the sensor signal. For example, the confidence of a GPS signal is represented by the Dilution of Precision (DOP) of the signal. However, because the DOP value of a signal depends on the distribution of the observing satellites, DOP values of the same location observed at different times can differ.
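As a concrete illustration, a DOP value is commonly converted into a position uncertainty by scaling an assumed user-equivalent range error (UERE); the UERE constant below is an illustrative assumption, not a value from this disclosure:

```python
def gps_position_sigma(dop: float, uere_m: float = 4.0) -> float:
    """Approximate 1-sigma GPS position uncertainty (meters) from a DOP value.

    uere_m is an assumed user-equivalent range error; real receivers estimate
    it from signal quality, so this constant is only a placeholder.
    """
    return dop * uere_m

# A low-DOP open-sky fix is trusted more than a high-DOP urban-canyon fix,
# and the same location can yield different DOP values at different times.
open_sky_sigma = gps_position_sigma(1.2)   # 4.8 m
urban_sigma = gps_position_sigma(5.0)      # 20.0 m
```

Because DOP depends on the satellite geometry at the time of observation, the resulting confidence is time-varying, which motivates aggregating repeated observations.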
Embodiments of the disclosure address the above problems by methods and systems that provide an improved solution for calculating a confidence of the HD map based on crowdsourcing.
SUMMARY
Embodiments of the disclosure disclose a method for constructing an HD map based on a confidence. The method may include receiving, by a communication interface, a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor. The method may further include determining, by at least one processor, a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data. The method may also include determining, by the at least one processor, an aggregated  position of the map element by weighting the first position and the second position. The method may also include determining, by the at least one processor, an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence. The method may also include constructing, by the at least one processor, the HD map based on the aggregated position and the aggregated confidence.
Embodiments of the disclosure further provide a system for constructing an HD map based on a confidence. The system may include a communication interface configured to communicate with a plurality of terminals via a network. The system may further include a storage configured to store the HD map. The system may also include at least one processor. The at least one processor may be configured to determine a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data. The at least one processor may be further configured to determine an aggregated position of the map element by weighting the first position and the second position. The at least one processor may also be configured to determine an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence. The at least one processor may also be configured to construct the HD map based on the aggregated position and the aggregated confidence.
Embodiments of the disclosure further disclose a non-transitory computer-readable medium having a computer program stored thereon. The computer program, when executed by at least one processor, may perform a method for constructing an HD map based on a confidence. The method may include receiving first sensor data acquired of a target region by a first sensor and second sensor data acquired of the target region by a second sensor. The method may further include determining a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data. The method may also include determining an aggregated position of the map element by weighting the first position and the second position. The method may also include determining an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence. The method may also include constructing the HD map based on the aggregated position and the aggregated confidence.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a schematic diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
FIG. 2 illustrates a block diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure.
FIG. 3 is a flowchart of an exemplary method for constructing an HD map, according to embodiments of the disclosure.
DETAILED DESCRIPTION
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
FIG. 1 illustrates a schematic diagram of an exemplary system 100 for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure. Consistent with some embodiments, system 100 may include a server 140 communicatively connected with a plurality of terminals, including terminals 131 and 132. In some embodiments, server 140 may be a local physical server, a cloud server (as illustrated in FIG. 1), a virtual server, a distributed server, or any other suitable computing device. Consistent with the present disclosure, server 140 may store an HD map.
Consistent with the present disclosure, server 140 may be responsible for calculating a confidence of an HD map based on crowdsourcing and constructing the HD map based on the confidence. Instead of constructing HD map based on sensor data acquired from one survey vehicle, server 140 may crowdsource data captured of a target region by multiple sensors at varying view positions and integrate such data to construct and update the HD map. For example, server 140 may crowdsource data from  terminals  131 and 132. Server 140 may communicate with  terminals  131 and 132 via a network, such as a Wireless Local Area Network (WLAN) , a Wide Area Network (WAN) , wireless networks such as radio waves, a nationwide cellular network, a satellite communication network, and/or a local wireless network (e.g., Bluetooth TM or WiFi) . Server 140 may receive data from  terminals  131 and 132. It is contemplated that server 140 may crowdsource from more terminals than those illustrated in FIG. 1.
In some embodiments, terminals 131 and 132 may be mobile sensing systems configured to move around a target region to acquire sensor data of the target region. For example, terminals 131 and 132 may each be or include a LiDAR scanner, a high-resolution camera, a GPS receiver, an IMU, or another cost-effective sensing device. In some embodiments, terminals 131 and 132 may be installed, mounted, or otherwise attached to a vehicle, such that the terminals may be carried around by the vehicle. The vehicle may be configured to be operated by an operator occupying the vehicle, remotely controlled, and/or autonomous. In some embodiments, a Simultaneous Localization and Mapping (SLAM) method may be performed by server 140 to combine data from different sensing devices. GPS/IMU signals and point cloud data may provide additional information to the SLAM algorithm, thus enhancing the positioning accuracy and reliability when the map is used by an autonomous vehicle to position itself.
In some embodiments, each of terminals 131 and 132 may include a combination of different sensing devices for acquiring sensor data of a target region 110. For example, the combination may include a LiDAR scanner, a high-resolution camera, a GPS receiver, an IMU, or other cost-effective sensing devices. Sensor data acquired by different sensing devices may be combined by server 140 using a SLAM algorithm. From the acquired sensor data, terminal 131 may detect map elements (e.g., lane markings 111 and 112, and a zebra line 113) within target region 110 as illustrated in FIG. 1. Terminal 131 may determine the confidence of sensor data acquired by its sensors. The sensor data and the confidence of the sensor data may be provided to server 140.
Because the initial confidence of each sensor data stream acquired from the respective sensors of terminal 131 is known based on the nature of the sensors and the conditions of the sensing (e.g., the time of the sensing, especially for locating data acquired by GPS), server 140 may determine the confidence of map elements based on the confidence of the sensor data and an error propagation rule. Server 140 may also determine an initial position of target region 110 based on the sensor data. For example, server 140 may use the sensor data to construct an initial HD map of target region 110 based on optimizing pose information of a plurality of map elements (e.g., optimizing pose information of lane markings 111 and 112, and zebra line 113) within the sensor data. Server 140 may further initiate an HD map construction process. In some embodiments, server 140 may match the initial HD map with sensor data of target region 110 acquired by other terminals, such as terminal 132. For example, server 140 may identify sensor data acquired from terminal 132 as also capturing at least a part of target region 110 and match the initial map of target region 110 with the identified sensor data. Based on the matched additional sensor data, server 140 may further update the initial HD map to construct an updated HD map.
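The error propagation rule mentioned above can be sketched with the standard first-order formula Σ_out = J·Σ_in·Jᵀ, where J is the Jacobian of the mapping from raw sensor measurements to a map-element position; the range/bearing measurement model below is an illustrative assumption rather than the disclosure's exact formulation:

```python
import numpy as np

def propagate_covariance(jacobian: np.ndarray, sensor_cov: np.ndarray) -> np.ndarray:
    """First-order error propagation: Sigma_out = J @ Sigma_in @ J.T."""
    return jacobian @ sensor_cov @ jacobian.T

# Hypothetical example: a LiDAR return measured as (range r, bearing theta)
# is mapped to a planar position (x, y) = (r*cos(theta), r*sin(theta)).
r, theta = 10.0, 0.3
J = np.array([[np.cos(theta), -r * np.sin(theta)],   # d(x, y) / d(r, theta)
              [np.sin(theta),  r * np.cos(theta)]])
sensor_cov = np.diag([0.05**2, 0.01**2])             # assumed sensor variances
element_cov = propagate_covariance(J, sensor_cov)    # 2x2 position confidence
```

The resulting 2x2 covariance plays the role of the map element's confidence in the discussion above.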
Because terminals 131 and 132 may acquire sensor data at different angles and/or distances relative to target region 110, they may acquire data of the same scene from different view positions. Accordingly, the different characteristics of the sensors, as well as the varying view positions from which the sensors capture the sensor data, may result in different estimated positions of each map element. These estimated positions are also associated with different confidences, as determined based on the different sets of sensor data. For example, for a map element captured by sensor data acquired from both terminals 131 and 132, two estimated positions may be generated for the element, each associated with a different confidence. In some embodiments, the different estimated positions and confidences may be aggregated to become the position and confidence of map elements of the constructed HD map. As the confidence of map elements in the HD map is determined based on aggregating different observations from multiple sensing devices, the constructed HD map may be more reliable when used for positioning an autonomous driving vehicle.
Although FIG. 1 illustrates target region 110 as containing exemplary map elements such as  lane markings  111 and 112, and a zebra line 113, it is contemplated that the disclosed systems and methods may apply to construct and update an HD map that includes other map elements. For example, map elements may also include traffic signs, buildings or trees, etc. within target region 110.
FIG. 2 illustrates a block diagram of an exemplary system for constructing an HD map with its confidence determined based on crowdsourcing, according to embodiments of the disclosure. Consistent with the present disclosure, server 140 may collect sensor data through terminals 131 and 132, integrate the data from multiple sources, calculate confidences of map elements within an HD map, and construct the HD map based on an aggregated confidence.
In some embodiments, as shown in FIG. 2, server 140 may include a communication interface 202, a processor 204, a memory 212, and a storage 214. In some embodiments, server 140 may have different modules in a single device, such as a standalone computing device, or separated devices with dedicated functions. In some embodiments, one or more components of server 140 may be located in a cloud or may be alternatively in a single location or distributed locations. Components of server 140 may be in an integrated device or distributed at different locations but communicate with each other through a network (not shown) .
Server 140 may communicate with terminals 131 and 132 through a network. Although omitted from FIG. 2, it is contemplated that each of terminals 131 and 132 may also include hardware components similar to those shown in server 140, such as a processor 230, a communication interface (not shown), a memory (not shown), and a storage (not shown).
Communication interface 202 may send data to and receive data from  terminals  131 and 132 or other system or device the terminals are attached to (e.g., a vehicle) via communication cables, a Wireless Local Area Network (WLAN) , a Wide Area Network (WAN) , wireless networks such as radio waves, a nationwide cellular network, and/or a local wireless network (e.g., Bluetooth TM or WiFi) , or other communication methods. In some embodiments, communication interface 202 can be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection. As another example, communication interface 202 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links can also be implemented by communication interface 202. In such an implementation, communication interface 202 can send and receive electrical, electromagnetic or optical signals that carry digital data streams representing various types of information via a network.
Consistent with some embodiments of the present disclosure, communication interface 202 may receive sensor data captured by  terminals  131 and 132 and provide the received sensor data to storage 214 for storage or to processor 204 for processing.
Processor 204 and processor 230 may include any appropriate type of general-purpose or special-purpose microprocessor, digital signal processor, or microcontroller. In some embodiments, processor 230 may be configured to acquire sensor data for constructing an HD map and to calculate confidences of the captured sensor data. Processor 204 may be configured as a separate processor module dedicated to combining the sensor data acquired by multiple terminals, calculating an aggregated confidence of the HD map, and constructing an HD map based thereon. Alternatively, processor 204 and processor 230 may be configured as shared processor modules that also perform other functions unrelated to HD map construction.
As shown in FIG. 2, terminal 131 may include a sensor data acquisition unit 231 and processor 230. Processor 230 may include a sensor data confidence unit 232, and the like. These modules (and any corresponding sub-modules or sub-units) can be hardware units (e.g., portions of an integrated circuit) designed for use with other components or to execute a part of a program. The program may be stored on a computer-readable medium, and when executed by processor 230, it may perform one or more functions. Although FIG. 2 shows only unit 232 within processor 230, it is contemplated that other units (e.g., sensor data acquisition unit 231) may be within processor 230 or, alternatively, may be distributed among multiple processors located near or remote from each other.
Sensor data acquisition unit 231 may be configured to capture sensor data of target region 110. Sensor data acquisition unit 231 may be coupled with sensors used in a navigation unit, such as a GPS receiver and one or more IMU sensors. A GPS is a global navigation satellite system that provides geolocation and time information to a GPS receiver. An IMU is an electronic device that measures and provides a vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using various inertial sensors, such as accelerometers and gyroscopes, sometimes also magnetometers. The combined GPS receiver and IMU sensor can provide real-time pose information of terminal 131 as it travels, including the positions and orientations (e.g., Euler angles) of terminal 131 at each time point.
Sensor data acquisition unit 231 may also be coupled with other positioning sensors, such as a high-resolution camera and a LiDAR scanner. A high-resolution camera may include an image acquisition unit which may be configured to capture images of surrounding  objects. In some embodiments, the image acquisition unit may include a controller controlling the setting and operation of a monocular camera. For example, the image acquisition unit may control and adjust the focus, aperture, shutter speed, white balance, metering, filters, and other settings of the camera. In some embodiments, the image acquisition unit may control and adjust the orientation and position of the camera so that the camera captures an image at a predetermined view angle and position. In some embodiments, the high-resolution camera may be set to capture images upon triggers, continuously, or periodically, and each image captured at a time point is called a frame. A LiDAR scanner may capture point cloud data of surrounding objects. LiDAR measures distance to a target by illuminating the target with pulsed laser light and measuring the reflected pulses with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target. The light used for LiDAR scan may be ultraviolet, visible, or near infrared.
Sensor data confidence unit 232 may be configured to determine the confidence of the sensor data. In some embodiments, the confidences of the respective sensing devices (e.g., a GPS/IMU sensor, a LiDAR scanner, and/or a high-resolution camera) are pre-determined and pre-programmed in sensor data confidence unit 232, e.g., as a look-up table. In some other embodiments, the confidences of the sensor data may be determined by sensor data confidence unit 232 based on device specifications and data acquisition settings.
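A pre-programmed confidence table of this kind could be as simple as a dictionary keyed by device type; the device names and 1-sigma accuracies below are illustrative assumptions only:

```python
# Hypothetical per-device 1-sigma accuracies in meters (placeholder values,
# not values taken from this disclosure).
SENSOR_SIGMA_M = {
    "gps_open_sky": 1.0,
    "gps_rtk": 0.02,
    "lidar": 0.05,
    "camera": 0.30,
}

def sensor_confidence(device: str) -> float:
    """Look up the pre-determined 1-sigma accuracy of a sensing device."""
    try:
        return SENSOR_SIGMA_M[device]
    except KeyError:
        raise ValueError(f"no pre-programmed confidence for {device!r}")
```

In the alternative embodiment, these table entries would instead be computed from device specifications and acquisition settings.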
Although not shown in FIG. 2, terminal 132 may have the same or different structure as terminal 131 or may be any structure capable of providing one or more types of different sensor data (e.g., capable of providing one or more of images, GPS/IMU data and/or point cloud data) to server 140. In some embodiments, terminal 132 may capture sensor data of target region 110 subsequent to terminal 131.
Upon receiving the first sensor data capturing target region 110, and confidences of the sensor data from terminal 131, communication interface 202 may send the data to processor 204. As shown in FIG. 2, processor 204 may include multiple modules, such as an HD map construction unit 241, an HD map confidence unit 242, and the like. These modules (and any corresponding sub-modules or sub-units) can be hardware units (e.g., portions of an integrated circuit) of processor 204 designed for use with other components or to execute a part of a program. The program may be stored on a computer-readable medium, and when executed by processor 204, it may perform one or more functions. Again, although FIG. 2 shows units 241 and 242 within one processor 204, it is contemplated that these units may be distributed among multiple processors located near or remote from each other.
HD map construction unit 241 may construct an initial HD map. In some embodiments, HD map construction unit 241 may be configured to combine different types of sensor data, e.g., captured by terminal 131, by performing a SLAM algorithm. For example, both GPS/IMU data and the point cloud data may provide absolute position information that may be used as constraints or a priori information to improve the SLAM algorithm. HD map construction unit 241 may also use a dynamic model to balance the contributions from various sensing devices. Because a SLAM algorithm applied to monocular images alone typically cannot provide accurate position information, and position errors may accumulate, consistent with the present disclosure, HD map construction unit 241 may integrate GPS, IMU, and LiDAR data into the SLAM algorithm to generate a more accurate and reliable initial HD map of target region 110. In some embodiments, HD map construction unit 241 may calculate the position of each map element within the HD map based on the initial HD map. For example, for a map element i (e.g., one of lane markings 111 and 112, or zebra line 113), the original position observed by a first sensing device (e.g., terminal 131) may be P_0.
The map element i may be further observed by the same or different terminals (e.g., terminal 131, terminal 132, and/or other similar functional terminals) N more times. HD map construction unit 241 may subsequently receive the second sensor data acquired by terminal 132 and confidences associated with the second sensor data from communication interface 202. The second sensor data may also include sensor data acquired by multiple sensor devices. In some embodiments, HD map construction unit 241 may identify the second sensor data that captures at least part of target region 110 among multiple sensor data received subsequently. HD map construction unit 241 may perform a similar positioning method as described above with respect to the first sensor data, e.g., the SLAM method, on the second sensor data to determine a position P_1 of map element i.
Accordingly, for the N observations of target region 110, the observed positions of map element i in those observations are P_1…P_N, respectively. The weights of those terminals are λ_1…λ_N (e.g., the weights of different terminals may be pre-allocated based on the accuracy of the sensors used by the terminals), and the weights are further restricted by equation (1):
λ_0 + λ_1 + … + λ_N = 1,  (1)
where λ_0 denotes the weight of the first observation.
In some embodiments, HD map confidence unit 242 may calculate an aggregated position of the map element i in the HD map according to equation (2) :
P_agg = λ_0·P_0 + λ_1·P_1 + … + λ_N·P_N  (2)
HD map confidence unit 242 may calculate the confidence of each map element within the constructed HD map corresponding to positions P_1…P_N. The confidence of the initial HD map may be determined based on the confidence of the sensing devices and the error propagation (e.g., calculating a variance or standard deviation of map elements based on the sensing device confidence) associated with the HD map construction. For example, following the above example, the confidence of the position of map element i observed by the first sensing device is Σ_0, the confidences of the positions observed by the following sets of sensing devices are Σ_1…Σ_N, and the aggregated confidence of the position of map element i in the HD map may be determined by equation (3):
Σ_agg = λ_0²·Σ_0 + λ_1²·Σ_1 + … + λ_N²·Σ_N  (3)
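Assuming the weights sum to one and the covariances of independent observations combine by standard error propagation through the weighted mean (one plausible reading of the weighting described above, not necessarily the disclosure's exact formulas), the aggregation can be sketched as:

```python
import numpy as np

def aggregate(positions, covariances, weights):
    """Aggregate N+1 observations of one map element.

    positions:   position vectors P_0 ... P_N
    covariances: confidence matrices Sigma_0 ... Sigma_N
    weights:     scalars lambda_0 ... lambda_N summing to 1
    """
    w = np.asarray(weights, dtype=float)
    assert abs(w.sum() - 1.0) < 1e-9, "weights must sum to 1"
    # Weighted mean of the observed positions.
    p_agg = sum(wk * np.asarray(pk, float) for wk, pk in zip(w, positions))
    # Assumed combination rule: variance of the weighted mean.
    s_agg = sum(wk**2 * np.asarray(sk, float) for wk, sk in zip(w, covariances))
    return p_agg, s_agg

# Two observations of a lane-marking corner; the second terminal is trusted more.
p_agg, s_agg = aggregate(
    positions=[np.array([10.0, 5.0]), np.array([10.2, 5.1])],
    covariances=[0.04 * np.eye(2), 0.01 * np.eye(2)],
    weights=[0.2, 0.8],
)
```

In this example the aggregated covariance (0.008·I) is smaller than either input covariance, reflecting that repeated observation raises the confidence of the map element.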
It is contemplated that terminal 132 or other similar terminals may capture a data frame at each of a sequence of time points. Multiple data frames may be combined (e.g., through time/space shifting) to form the sensor data. Server 140 may calculate the aggregated position and confidence of the map element when each data frame is received, when multiple data frames are received, when all the sensor data captured by the terminal is received, or any suitable combination of the above. In other words, the sensor data may be streamed as they become available, which may enable server 140 to process the sensor data frame by frame in real time while subsequent frames are being captured. Alternatively, data may be transmitted in bulk after a section of, or the entire, survey is completed.
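Frame-by-frame processing can avoid recomputing over all past observations by keeping running weighted sums; this incremental formulation is a sketch under the same weighted-mean assumption as above, not the disclosure's stated implementation:

```python
import numpy as np

class RunningAggregate:
    """Incrementally maintain the weighted position and confidence of a map element."""

    def __init__(self, dim: int = 2):
        self.w_sum = 0.0                     # sum of weights seen so far
        self.wp_sum = np.zeros(dim)          # sum of lambda_k * P_k
        self.w2s_sum = np.zeros((dim, dim))  # sum of lambda_k^2 * Sigma_k

    def add(self, position, covariance, weight: float) -> None:
        """Fold in one new observation as its frame arrives."""
        self.w_sum += weight
        self.wp_sum += weight * np.asarray(position, float)
        self.w2s_sum += weight**2 * np.asarray(covariance, float)

    def result(self):
        """Current aggregate, normalized so the effective weights sum to 1."""
        p = self.wp_sum / self.w_sum
        s = self.w2s_sum / self.w_sum**2
        return p, s

agg = RunningAggregate(dim=2)
agg.add([10.0, 5.0], 0.04 * np.eye(2), weight=0.2)
agg.add([10.2, 5.1], 0.01 * np.eye(2), weight=0.8)
p_now, s_now = agg.result()
```

Normalizing by the running weight sum means frames can arrive in any order and at any time without changing the final aggregate.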
Memory 212 and storage 214 may include any appropriate type of mass storage provided to store any type of information that processor 204 may need to operate. Memory 212 and storage 214 may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM. Memory 212 and/or storage 214 may be configured to store one or more computer programs that may be executed by processor 204 to perform map update functions disclosed in this application. For example, memory 212 and/or storage 214 may be configured to store program(s) that may be executed by processor 204 to communicate with terminals 131-134 for image acquisitions, and to update an HD map using the images.
Memory 212 and/or storage 214 may be further configured to store information and data used by processor 204. For instance, memory 212 and/or storage 214 may be configured to store the HD map, including its point cloud data, and images captured by terminals 131-134, the machine learning models (e.g., the model parameters) and the feature maps, and other intermediate data created during the processing. These data may be stored permanently, removed periodically, or disregarded immediately after each frame of data is processed.
FIG. 3 is a flowchart of an exemplary method 300 performed by a server for constructing an HD map and calculating a confidence of the HD map, according to embodiments of the disclosure. For example, method 300 may be implemented by processor 204 of server 140. It is contemplated that method 300 may also be partially or fully implemented by processor 230 of terminal 131. For example, steps for processing the sensor data received by terminal 131, such as S302-S306, may be performed by processor 230.
Method 300 may include steps S302-S316 as described below, based on an exemplary embodiment in which all steps are performed by server 140.
In step S302, server 140 may be configured to receive sensor data of a target region from one or more terminals (e.g., terminal 131). For example, the sensor data may include GPS and IMU information of the target region. The sensor data may also include images and/or point cloud data of surrounding objects.
In step S304, server 140 may be configured to construct an initial HD map and determine an initial position of the target region based on the sensor data. For example, server 140 may perform a SLAM algorithm to combine different types of sensor data to construct the initial HD map and determine the initial position of the target region (e.g., both GPS/IMU data and the point cloud data may be used as constraints or a priori information to the SLAM algorithm applied to image data of the target region).
In step S306, server 140 may further be configured to calculate the confidence of the initial map. For example, the confidence of the initial map may be determined based on the confidence of the sensing devices and the error propagation (e.g., to calculate a variance or standard deviation of map elements based on the sensing device confidence) associated with the HD map construction.
In step S308, server 140 may be configured to receive additional sensor data from terminals other than terminal 131 or from another service trip of terminal 131.
In step S310, server 140 may be configured to determine a second position of the target region based on the additional sensor data received in step S308 and to calculate a confidence of the second position. Server 140 may determine the second position of the target region using the same or a similar method as used in step S304 for determining the initial position of the target region. Server 140 may further determine the confidence of the second position using the same or a similar method as used in step S306 for calculating the confidence of the initial map.
In step S312, server 140 may be configured to determine an aggregated position of the target region. In some embodiments, server 140 may determine the aggregated position of the target region HD map based on registering the second sensor data to the initial map. In some embodiments, server 140 may establish constraints between the initial map and the second sensor data (e.g., based on odometry and/or registration of the initial map and the second sensor data) and register the second sensor data to the initial map to determine an aggregated position of the target region. For example, for a map element i, the original position observed by a first set of sensing devices (e.g., terminal 131) is P_0, and the map element i has been further observed by following sets of sensing devices (terminal 132 and/or other similar functional terminals) N more times. The observed positions of map element i in those observations are P_1…P_N, respectively. Given that the weight of the first set of sensing devices is λ_0 and the weights of the following sets are λ_1…λ_N (e.g., the weights of different terminals may be pre-allocated based on the accuracy of the sensors used by the terminals), the weights are further restricted by equation (1):
λ_0 + λ_1 + … + λ_N = 1    (1)
The aggregated position of the map element i in the HD map may be determined by equation (2) :
P = λ_0·P_0 + λ_1·P_1 + … + λ_N·P_N    (2)
In step S314, server 140 may be configured to determine an aggregated confidence of the target region. For example, following the above example, if the confidence of the position observed by the first set of sensing devices (e.g., terminal 131) is ∑_0 and the confidences of the positions observed by the following sets of sensing devices are ∑_1…∑_N, the aggregated confidence of the position of the map element i in the HD map may be determined as shown in equation (3):
∑ = λ_0²·∑_0 + λ_1²·∑_1 + … + λ_N²·∑_N    (3)
In step S316, server 140 may be configured to update/construct the initial HD map based on the aggregated confidence and the aggregated position. Because the confidence of each map element in the HD map is determined by aggregating different observations from multiple sensing devices, the constructed HD map may be more reliable when used for positioning an autonomous driving vehicle.
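The aggregation described by equations (1)-(3) can be sketched as follows. This is a hedged illustration, not the patent's implementation: the function name, the 3-D positions, and the example weights and covariances are all assumptions, and the covariance combination assumes independent observations.

```python
# Sketch of crowdsourced aggregation for one map element: positions P_0..P_N are
# fused with weights lambda_0..lambda_N that sum to 1 (equation (1)), and the
# per-observation confidences (covariances) Sigma_0..Sigma_N are combined as
# sum(lambda_j**2 * Sigma_j), assuming independent observations.
import numpy as np

def aggregate_element(positions, covariances, weights):
    """Fuse repeated observations of one map element.

    positions:   list of (3,) arrays, P_0 ... P_N
    covariances: list of (3, 3) arrays, Sigma_0 ... Sigma_N
    weights:     list of scalars lambda_0 ... lambda_N, summing to 1
    """
    weights = np.asarray(weights, dtype=float)
    if not np.isclose(weights.sum(), 1.0):
        raise ValueError("weights must sum to 1 (equation (1))")
    # Equation (2): weighted average of the observed positions.
    p = sum(w * np.asarray(pos) for w, pos in zip(weights, positions))
    # Equation (3): covariance of the weighted average (independent observations).
    s = sum(w ** 2 * np.asarray(cov) for w, cov in zip(weights, covariances))
    return p, s

# Two observations of the same map element; the second terminal's sensors are
# assumed more accurate, so it receives twice the weight.
p0, s0 = np.array([10.0, 5.0, 0.0]), np.eye(3) * 0.04
p1, s1 = np.array([10.2, 5.1, 0.0]), np.eye(3) * 0.02
p, s = aggregate_element([p0, p1], [s0, s1], [1 / 3, 2 / 3])
```

Note that the aggregated covariance is smaller than either input covariance, reflecting the increased reliability gained from multiple observations.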
Another aspect of the disclosure is directed to a non-transitory computer-readable medium storing instructions which, when executed, cause one or more processors to perform the methods, as discussed above. The computer-readable medium may include volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other types of computer-readable medium or computer-readable storage devices. For example, the computer-readable medium may be the storage device or the memory module having the computer instructions stored thereon, as disclosed. In some embodiments, the computer-readable medium may be a disc or a flash drive having the computer instructions stored thereon.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed system and related methods. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed system and related methods.
It is intended that the specification and examples be considered as exemplary only, with a true scope being indicated by the following claims and their equivalents.

Claims (20)

  1. A method for constructing an HD map based on a confidence, comprising:
    receiving, by a communication interface, a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor;
    determining, by at least one processor, a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data;
    determining, by the at least one processor, an aggregated position of the map element by weighting the first position and the second position;
    determining, by the at least one processor, an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence; and constructing, by the at least one processor, the HD map based on the aggregated position and the aggregated confidence.
  2. The method of claim 1, wherein the first position is determined using a SLAM algorithm.
  3. The method of claim 1, wherein the first confidence is determined based on a confidence of the first sensor data and an error propagation associated with constructing the HD map, and wherein the second confidence is determined based on a confidence of the second sensor data and the error propagation associated with constructing the HD map.
  4. The method of claim 1, further comprising determining a first weight associated with the first sensor data based on an accuracy of the first sensor, and a second weight associated with the second sensor data based on an accuracy of the second sensor.
  5. The method of claim 4, wherein determining the aggregated position further comprises calculating a weighted average of the first position weighted by the first weight and the second position weighted by the second weight.
  6. The method of claim 4, wherein determining the aggregated confidence further comprises calculating a weighted average of the first confidence weighted by a square of the first weight and the second confidence weighted by a square of the second weight.
  7. The method of claim 1, wherein the first and second sensors each comprise at least one of a camera, a LiDAR, or a GPS/IMU sensor.
  8. The method of claim 1, wherein the first sensor is equipped on a first terminal and the second sensor is equipped on a second terminal.
  9. The method of claim 8, wherein at least one of the first terminal and the second terminal is an autonomous driving vehicle.
  10. The method of claim 1, further comprising identifying the second sensor data, among a plurality of received sensor data, as being acquired of the same target region as the first sensor data.
  11. A system for constructing an HD map based on a confidence, comprising:
    a communication interface configured to receive a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor;
    a storage configured to store the HD map; and
    at least one processor, configured to:
    determine a first position of a map element within the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data;
    determine an aggregated position of the map element by weighting the first position and the second position;
    determine an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence; and
    construct the HD map based on the aggregated position and the aggregated confidence.
  12. The system of claim 11, wherein the first position is determined using a SLAM algorithm.
  13. The system of claim 11, wherein the first confidence is determined based on a confidence of the first sensor data and an error propagation associated with constructing the HD map and wherein the second confidence is determined based on a confidence of the second sensor data and the error propagation associated with constructing the HD map.
  14. The system of claim 11, wherein the at least one processor is further configured to determine a first weight associated with the first sensor data based on an accuracy of the first sensor and a second weight associated with the second sensor data based on an accuracy of the second sensor.
  15. The system of claim 14, wherein to determine the aggregated position, the at least one processor is further configured to calculate a weighted average of the first position weighted by the first weight and the second position weighted by the second weight.
  16. The system of claim 14, wherein to determine the aggregated confidence, the at least one processor is further configured to calculate a weighted average of the first confidence weighted by a square of the first weight and the second confidence weighted by a square of the second weight.
  17. The system of claim 11, wherein the first and second sensors each comprise at least one of a camera, a LiDAR, or a GPS/IMU sensor.
  18. The system of claim 11, wherein the first sensor is equipped on a first terminal and the second sensor is equipped on a second terminal.
  19. The system of claim 18, wherein at least one of the first terminal and the second terminal is an autonomous driving vehicle.
  20. A non-transitory computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by at least one processor, performs a method for constructing an HD map based on a confidence, the method comprising:
    receiving a first sensor data acquired of a target region by a first sensor and a second sensor data acquired of the target region by a second sensor;
    determining a first position of a map element of the target region and a first confidence associated with the first position based on the first sensor data, and a second position of the map element and a second confidence associated with the second position based on the second sensor data;
    determining an aggregated position of the map element by weighting the first position and the second position;
    determining an aggregated confidence associated with the aggregated position by weighting the first confidence and the second confidence; and
    constructing the HD map based on the aggregated position and the aggregated confidence.
PCT/CN2020/106502 2020-08-03 2020-08-03 Systems and methods for constructing high-definition map with its confidence determined based on crowdsourcing WO2022027159A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/106502 WO2022027159A1 (en) 2020-08-03 2020-08-03 Systems and methods for constructing high-definition map with its confidence determined based on crowdsourcing


Publications (1)

Publication Number Publication Date
WO2022027159A1 true WO2022027159A1 (en) 2022-02-10

Family

ID=80120101



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160034824A1 (en) * 2014-08-04 2016-02-04 International Business Machines Corporation Auto-analyzing spatial relationships in multi-scale spatial datasets for spatio-temporal prediction
US20190080203A1 (en) * 2017-09-11 2019-03-14 Baidu Online Network Technology (Beijing) Co, Ltd Method And Apparatus For Outputting Information
CN109556615A (en) * 2018-10-10 2019-04-02 吉林大学 The driving map generation method of Multi-sensor Fusion cognition based on automatic Pilot
CN110799804A (en) * 2017-06-30 2020-02-14 深圳市大疆创新科技有限公司 Map generation system and method
CN111094896A (en) * 2017-09-08 2020-05-01 罗伯特·博世有限公司 Method and apparatus for creating map



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20948813; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 28/03/2023))
122 Ep: pct application non-entry in european phase (Ref document number: 20948813; Country of ref document: EP; Kind code of ref document: A1)