WO2020206774A1 - Systems and methods for positioning - Google Patents


Info

Publication number
WO2020206774A1
Authority
WO
WIPO (PCT)
Prior art keywords
local map
similarity
map
subject
cells
Application number
PCT/CN2019/085637
Other languages
English (en)
French (fr)
Inventor
Tingbo Hou
Xiaozhi Qu
Original Assignee
Beijing Voyager Technology Co., Ltd.
Application filed by Beijing Voyager Technology Co., Ltd.
Publication of WO2020206774A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Definitions

  • The present disclosure generally relates to positioning technology, and in particular, to systems and methods for determining a target location of a subject in a map.
  • Positioning techniques are widely used in various fields, such as an autonomous driving system.
  • the positioning techniques may be used to determine an accurate location of an autonomous vehicle by matching a local map, generated from scanning data (e.g., point-cloud data) acquired by one or more sensors (e.g., a LiDAR) installed on the autonomous vehicle, with a pre-built map (e.g., a high-definition map).
  • Precision positioning of the subject relies on accurate matching of the local map with the pre-built map.
  • a system for positioning may include at least one storage medium storing a set of instructions and at least one processor in communication with the at least one storage medium.
  • the at least one processor may cause the system to obtain an estimated location of a subject.
  • the at least one processor may also cause the system to obtain point-cloud data associated with the estimated location.
  • the point-cloud data may be acquired by one or more sensors associated with the subject.
  • the at least one processor may further cause the system to generate, based on the point-cloud data, a local map associated with the estimated location.
  • the at least one processor may further cause the system to obtain, based on the estimated location, a reference map.
  • the at least one processor may still further cause the system to determine a target location of the subject by matching the local map with the reference map.
  • the at least one processor may also cause the system to match the local map with the reference map using a normalized cross-correlation (NCC) technique.
  • the at least one processor may further cause the system to determine, based on the matching between the local map and the reference map, the target location of the subject.
  • the at least one processor may also cause the system to determine a similarity between the local map and each of the plurality of cells.
  • the at least one processor may further cause the system to determine, based on one of the plurality of cells having a maximum similarity with the local map, the target location of the subject.
  • the at least one processor may also cause the system to designate a location corresponding to the one of the plurality of cells having the maximum similarity with the local map as the target location.
  • the at least one processor may also cause the system to determine a plurality of locations within the plurality of cells. Each of the plurality of locations may correspond to one of the plurality of cells. The at least one processor may further cause the system to determine, based on the plurality of locations and the similarity corresponding to each of the plurality of cells, the target location of the subject.
  • the at least one processor may also cause the system to determine, based on intensity information presented in the local map and each of the plurality of cells, a first similarity between the local map and each of the plurality of cells.
  • the at least one processor may further cause the system to determine, based on elevation information presented in the local map and each of the plurality of cells, a second similarity between the local map and each of the plurality of cells.
  • the at least one processor may further cause the system to determine, based on the first similarity and the second similarity, the similarity between the local map and each of the plurality of cells.
  • the at least one processor may also cause the system to determine a first weight corresponding to the first similarity.
  • the at least one processor may further cause the system to determine a second weight corresponding to the second similarity.
  • the at least one processor may further cause the system to determine, based on the first weight, the second weight, the first similarity, and the second similarity, the similarity between the local map and each of the plurality of cells.
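  • For illustration only, a weighted combination of an intensity-based similarity and an elevation-based similarity might be computed as sketched below; the normalized cross-correlation helper, the equal default weights, and the function names are assumptions of the example rather than the claimed implementation.

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized 2-D arrays."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def combined_similarity(local_intensity, cell_intensity,
                        local_elevation, cell_elevation,
                        w_intensity=0.5, w_elevation=0.5) -> float:
    """Combine an intensity-based (first) and an elevation-based (second) similarity with weights."""
    s_intensity = ncc(local_intensity, cell_intensity)   # first similarity
    s_elevation = ncc(local_elevation, cell_elevation)   # second similarity
    return w_intensity * s_intensity + w_elevation * s_elevation
```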
  • the at least one processor may also cause the system to process at least one of the local map or the point-cloud data using a first trained machine learning model.
  • the first trained machine learning model may be obtained by training a first machine learning model using a plurality of first training samples.
  • the at least one processor may also cause the system to denoise the at least one of the local map or the point-cloud data associated with the estimated location using the first trained machine learning model.
  • the at least one processor may also cause the system to remove one or more moving objects from the local map using the first trained machine learning model.
  • the at least one processor may also cause the system to calibrate, based on one or more pre-determined calibration parameters, the intensity information presented in the local map, the intensity information being associated with signals received by the one or more sensors.
  • the at least one processor may also cause the system to determine, based on data acquired by at least one of a GPS device or an inertial measurement unit (IMU) sensor, the estimated location of the subject.
  • the reference map may be generated off-line.
  • the reference map may be generated further by processing the reference map using a second trained machine learning model.
  • the second trained machine learning model may be obtained by training a second machine learning model using a plurality of second training samples.
  • a method may include one or more of the following operations performed by at least one processor.
  • the method may include obtaining an estimated location of a subject.
  • the method may also include obtaining point-cloud data associated with the estimated location, the point-cloud data being acquired by one or more sensors associated with the subject.
  • the method may also include generating, based on the point-cloud data, a local map associated with the estimated location.
  • the method may further include obtaining, based on the estimated location, a reference map.
  • the method may still further include determining a target location of the subject by matching the local map with the reference map.
  • a non-transitory computer readable medium may include at least one set of instructions for positioning. When executed by at least one processor, the at least one set of instructions may cause the at least one processor to perform a method.
  • the method may include obtaining an estimated location of a subject.
  • the method may also include obtaining point-cloud data associated with the estimated location, the point-cloud data being acquired by one or more sensors associated with the subject.
  • the method may also include generating, based on the point-cloud data, a local map associated with the estimated location.
  • the method may further include obtaining, based on the estimated location, a reference map.
  • the method may still further include determining a target location of the subject by matching the local map with the reference map.
  • a system for positioning may include an obtaining module, a generation module, and a determination module.
  • the obtaining module may be configured to obtain an estimated location of a subject.
  • the obtaining module may be configured to obtain point-cloud data associated with the estimated location.
  • the point-cloud data may be acquired by one or more sensors associated with the subject.
  • the generation module may be configured to generate, based on the point-cloud data, a local map associated with the estimated location.
  • the obtaining module may be configured to obtain, based on the estimated location, a reference map.
  • the determination module may be configured to determine a target location of the subject by matching the local map with the reference map.
  • FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device on which a terminal device may be implemented according to some embodiments of the present disclosure
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure
  • FIG. 5 is a flowchart illustrating an exemplary process for positioning a subject according to some embodiments of the present disclosure
  • FIG. 6 is a flowchart illustrating an exemplary process for positioning a subject according to some embodiments of the present disclosure
  • FIG. 7 is a flowchart illustrating an exemplary process for determining a similarity between a local map and a cell in a reference map according to some embodiments of the present disclosure.
  • FIG. 8 is a schematic diagram illustrating an exemplary process for determining a target location of a subject according to some embodiments of the present disclosure.
  • the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in order. Conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
  • Although the systems and methods disclosed in the present disclosure are described primarily with respect to positioning a subject (e.g., an autonomous vehicle) in an autonomous driving system, it should be understood that this is only one exemplary embodiment.
  • the systems and methods of the present disclosure may be applied to any other kind of transportation system.
  • the systems and methods of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof.
  • the autonomous vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, or the like, or any combination thereof.
  • the systems and methods may obtain an estimated location of a subject (e.g., an autonomous vehicle) .
  • the systems and methods may obtain point-cloud data associated with the estimated location.
  • the point-cloud data may be acquired by one or more sensors (e.g., a LiDAR) associated with the subject.
  • the systems and methods may generate, based on the point-cloud data, a local map associated with the estimated location.
  • the systems and methods may obtain, based on the estimated location, a reference map.
  • the systems and methods may determine a target location of the subject by matching the local map with the reference map. For example, the systems and methods may match the local map with the reference map by using a normalized cross-correlation (NCC) technique. Accordingly, the target location of the subject may be determined more accurately.

  • FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system according to some embodiments of the present disclosure.
  • the autonomous driving system 100 may include a vehicle 110, a server 120, a terminal device 130, a storage device 140, a network 150, and a positioning and navigation system 160.
  • the vehicles 110 may carry a passenger and travel to a destination.
  • the vehicles 110 may include a plurality of vehicles 110-1, 110-2... 110-n.
  • the vehicles 110 may be any type of autonomous vehicles.
  • An autonomous vehicle may be capable of sensing its environment and navigating without human maneuvering.
  • the vehicle (s) 110 may include structures of a conventional vehicle, for example, a chassis, a suspension, a steering device (e.g., a steering wheel) , a brake device (e.g., a brake pedal) , an accelerator, etc.
  • the vehicle (s) 110 may be a survey vehicle configured for acquiring data for constructing a high-definition map or 3-D city modeling (e.g., a reference map as described elsewhere in the present disclosure) . It is contemplated that vehicle (s) 110 may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, a conventional internal combustion engine vehicle, etc.
  • vehicle (s) 110 may have a body and at least one wheel. The body may be any body style, such as a sports vehicle, a coupe, a sedan, a pick-up truck, a station wagon, a sports utility vehicle (SUV) , a minivan, or a conversion van.
  • the vehicle(s) 110 may include a pair of front wheels and a pair of rear wheels. However, it is contemplated that the vehicle(s) 110 may have more or fewer wheels or equivalent structures that enable the vehicle(s) 110 to move around.
  • the vehicle(s) 110 may be configured to be all wheel drive (AWD), front wheel drive (FWD), or rear wheel drive (RWD).
  • the vehicle (s) 110 may be configured to be operated by an operator occupying the vehicle, remotely controlled, and/or autonomous.
  • the vehicle (s) 110 may be equipped with one or more sensors 112 mounted to the body of the vehicle (s) 110 via a mounting structure.
  • the mounting structure may be an electro-mechanical device installed or otherwise attached to the body of the vehicle (s) 110. In some embodiments, the mounting structure may use screws, adhesives, or another mounting mechanism.
  • the vehicle (s) 110 may be additionally equipped with the one or more sensors 112 inside or outside the body using any suitable mounting mechanisms.
  • the sensors 112 may include a GPS device, a light detection and ranging (LiDAR) , a camera, an inertial measurement unit (IMU) sensor, or the like, or any combination thereof.
  • the LiDAR may be configured to scan the surrounding and generate point-cloud data.
  • the LiDAR may measure a distance to an object by illuminating the object with pulsed laser light and measuring the reflected pulses with a sensor. Differences in laser return times and wavelengths may then be used to make digital 3-D representations of the object.
  • the light used for LiDAR scan may be ultraviolet, visible, near infrared, etc. Because a narrow laser beam may map physical features with very high resolution, the LiDAR may be particularly suitable for high-definition map surveys.
  • the camera may be configured to obtain one or more images relating to objects (e.g., a person, an animal, a tree, a roadblock, a building, or a vehicle) that are within the scope of the camera.
  • the GPS device may refer to a device that is capable of receiving geolocation and time information from GPS satellites and then calculating the device's geographical position.
  • the IMU sensor may refer to an electronic device that measures and provides a vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using various inertial sensors, such as accelerometers and gyroscopes, sometimes also magnetometers.
  • the sensors 112 can provide real-time pose information of the vehicle(s) 110 as it travels, including the positions and orientations (e.g., Euler angles) of the vehicle(s) 110 at each time point. Consistent with the present disclosure, the sensors 112 may take measurements of pose information at the same time points at which the sensors 112 capture the point cloud data. Accordingly, the pose information may be associated with the respective point cloud data. In some embodiments, the combination of point cloud data and its associated pose information may be used to position the vehicle(s) 110, for example, by transforming each point cloud frame into a common coordinate frame using the associated pose, as in the sketch below.
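  • A minimal sketch, assuming the pose is given as a translation plus Euler angles (roll, pitch, yaw) in radians; the function names and rotation convention are illustrative, not taken from the disclosure.

```python
import numpy as np

def euler_to_rotation(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from Euler angles (radians), applied as R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def frame_to_world(points: np.ndarray, translation: np.ndarray,
                   roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Transform an (N, 3) point cloud frame into world coordinates using its associated pose."""
    r = euler_to_rotation(roll, pitch, yaw)
    return points @ r.T + translation
```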
  • the server 120 may be a single server or a server group.
  • the server group may be centralized or distributed (e.g., the server 120 may be a distributed system) .
  • the server 120 may be local or remote.
  • the server 120 may access information and/or data stored in the terminal device 130, the sensors 112, the vehicle 110, the storage device 140, and/or the positioning and navigation system 160 via the network 150.
  • the server 120 may be directly connected to the terminal device 130, the sensors 112, the vehicle 110, and/or the storage device 140 to access stored information and/or data.
  • the server 120 may be implemented on a cloud platform or an onboard computer.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the server 120 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
  • the server 120 may include a processing engine 122.
  • the processing engine 122 may process information and/or data associated with the vehicle 110 to perform one or more functions described in the present disclosure. For example, the processing engine 122 may obtain an estimated location of the vehicle 110. As another example, the processing engine 122 may obtain point-cloud data associated with the estimated location of the vehicle 110. As still another example, the processing engine 122 may generate a local map associated with the estimated location based on the point-cloud data. As still another example, the processing engine 122 may obtain a reference map based on the estimated location. As still another example, the processing engine 122 may determine a target location of the vehicle 110 by matching the local map with the reference map.
  • the processing engine 122 may include one or more processing engines (e.g., single-core processing engine (s) or multi-core processor (s) ) .
  • the processing engine 122 may include a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a digital signal processor (DSP) , a field programmable gate array (FPGA) , a programmable logic device (PLD) , a controller, a microcontroller unit, a reduced instruction-set computer (RISC) , a microprocessor, or the like, or any combination thereof.
  • the server 120 may be connected to the network 150 to communicate with one or more components (e.g., the terminal device 130, the sensors 112, the vehicle 110, the storage device 140, and/or the positioning and navigation system 160) of the autonomous driving system 100.
  • the server 120 may be directly connected to or communicate with one or more components (e.g., the terminal device 130, the sensors 112, the vehicle 110, the storage device 140, and/or the positioning and navigation system 160) of the autonomous driving system 100.
  • the server 120 may be integrated in the vehicle 110.
  • the server 120 may be a computing device (e.g., a computer) installed in the vehicle 110.
  • the terminal devices 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a vehicle 130-4, a smart watch 130-5, or the like, or any combination thereof.
  • the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google TM Glass, an Oculus Rift, a HoloLens, a Gear VR, etc.
  • the built-in device in the vehicle 130-4 may include an onboard computer, an onboard television, etc.
  • the server 120 may be integrated into the terminal device 130.
  • the storage device 140 may store data and/or instructions.
  • the storage device 140 may store data obtained from the terminal device 130, the sensors 112, the vehicle 110, the positioning and navigation system 160, the processing engine 122, and/or an external storage device.
  • the storage device 140 may store an estimated location of the vehicle 110 received from the sensors 112 (e.g., a GPS device, an IMU sensor) .
  • the storage device 140 may store point-cloud data associated with the estimated location received from the sensors 112 (e.g., a LiDAR) .
  • the storage device 140 may store a local map generated by the processing engine 122.
  • the storage device 140 may store a reference map obtained from an external storage device.
  • the storage device 140 may store data and/or instructions that the server 120 may execute or use to perform exemplary methods described in the present disclosure.
  • the storage device 140 may store instructions that the processing engine 122 may execute or use to generate, based on point-cloud data, a local map associated with an estimated location.
  • the storage device 140 may store instructions that the processing engine 122 may execute or use to determine a target location of the vehicle 110 by matching a local map with a reference map.
  • the storage device 140 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically-erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage device 140 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 140 may be connected to the network 150 to communicate with one or more components (e.g., the server 120, the terminal device 130, the sensors 112, the vehicle 110, and/or the positioning and navigation system 160) of the autonomous driving system 100.
  • One or more components of the autonomous driving system 100 may access the data or instructions stored in the storage device 140 via the network 150.
  • the storage device 140 may be directly connected to or communicate with one or more components (e.g., the server 120, the terminal device 130, the sensors 112, the vehicle 110, and/or the positioning and navigation system 160) of the autonomous driving system 100.
  • the storage device 140 may be part of the server 120.
  • the storage device 140 may be integrated in the vehicle 110.
  • the network 150 may facilitate exchange of information and/or data.
  • one or more components (e.g., the server 120, the terminal device 130, the sensors 112, the vehicle 110, the storage device 140, or the positioning and navigation system 160) of the autonomous driving system 100 may send information and/or data to other component(s) of the autonomous driving system 100 via the network 150.
  • the server 120 may obtain/acquire an estimated location of a subject (e.g., the vehicle 110) from the sensors 112 and/or the positioning and navigation system 160 via the network 150.
  • the server 120 may obtain/acquire point-cloud data associated with the estimated location of the subject (e.g., the vehicle 110) from the sensors 112 via the network 150.
  • the server 120 may obtain/acquire, based on the estimated location, a reference map from the storage device 140 via the network 150.
  • the network 150 may be any type of wired or wireless network, or combination thereof.
  • the network 150 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 150 may include one or more network access points.
  • the network 150 may include wired or wireless network access points (e.g., 150-1, 150-2) , through which one or more components of the autonomous driving system 100 may be connected to the network 150 to exchange data and/or information.
  • the positioning and navigation system 160 may determine information associated with an object, for example, one or more of the terminal devices 130, the vehicle 110, etc.
  • the positioning and navigation system 160 may be a global positioning system (GPS) , a global navigation satellite system (GLONASS) , a compass navigation system (COMPASS) , a BeiDou navigation satellite system, a Galileo positioning system, a quasi-zenith satellite system (QZSS) , etc.
  • the information may include a location, an elevation, a velocity, or an acceleration of the object, or a current time.
  • the positioning and navigation system 160 may include one or more satellites, for example, a satellite 160-1, a satellite 160-2, and a satellite 160-3.
  • the satellites 160-1 through 160-3 may determine the information mentioned above independently or jointly.
  • the satellite positioning and navigation system 160 may send the information mentioned above to the network 150, the terminal device 130, or the vehicle 110 via wireless connections.
  • the autonomous driving system 100 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
  • the autonomous driving system 100 may further include a database, an information source, etc.
  • the autonomous driving system 100 may be implemented on other devices to realize similar or different functions.
  • the GPS device may also be replaced by another positioning device, such as a BeiDou device.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure.
  • the server 120 may be implemented on the computing device 200.
  • the processing engine 122 may be implemented on the computing device 200 and configured to perform functions of the processing engine 122 disclosed in this disclosure.
  • the computing device 200 may be used to implement any component of the autonomous driving system 100 of the present disclosure.
  • the processing engine 122 of the autonomous driving system 100 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof.
  • the computer functions related to the autonomous driving system 100 as described herein may be implemented in a distributed manner on a number of similar platforms to distribute the processing load.
  • the computing device 200 may include communication (COMM) ports 250 connected to and from a network (e.g., the network 150) connected thereto to facilitate data communications.
  • the computing device 200 may also include a processor (e.g., a processor 220) , in the form of one or more processors (e.g., logic circuits) , for executing program instructions.
  • the processor may include interface circuits and processing circuits therein.
  • the interface circuits may be configured to receive electronic signals from a bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process.
  • the processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the bus 210.
  • the exemplary computing device 200 may further include program storage and data storage of different forms, for example, a disk 270, and a read only memory (ROM) 230, or a random access memory (RAM) 240, for various data files to be processed and/or transmitted by the computing device 200.
  • the exemplary computing device 200 may also include program instructions stored in the ROM 230, the RAM 240, and/or other type of non-transitory storage medium to be executed by the processor 220.
  • the methods and/or processes of the present disclosure may be implemented as the program instructions.
  • the computing device 200 also includes an I/O component 260, supporting input/output between the computing device 200 and other components therein.
  • the computing device 200 may also receive programming and data via network communications.
  • the computing device 200 in the present disclosure may also include multiple processors, and thus operations that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • For example, if the processor of the computing device 200 executes both operation A and operation B, it should be understood that operation A and operation B may also be performed by two different processors jointly or separately in the computing device 200 (e.g., the first processor executes operation A and the second processor executes operation B, or the first and second processors jointly execute operations A and B).
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device on which a terminal device may be implemented according to some embodiments of the present disclosure.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • a mobile operating system 370 (e.g., iOS TM, Android TM, Windows Phone TM) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to positioning or other information from the processing engine 122.
  • User interactions with the information stream may be achieved via the I/O 350 and provided to the processing engine 122 and/or other components of the autonomous driving system 100 via the network 150.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure.
  • the processing engine 122 may include an obtaining module 410, a generation module 420, a determination module 430, and a storage module 440.
  • the obtaining module 410 may be configured to obtain data and/or information associated with the autonomous driving system 100. For example, the obtaining module 410 may obtain an estimated location (e.g., a geographic location) of a subject (e.g., an autonomous vehicle) . As another example, the obtaining module 410 may obtain point-cloud data associated with an estimated location of a subject. As still another example, the obtaining module 410 may obtain a reference map based on an estimated location of a subject.
  • the generation module 420 may be configured to generate data and/or information associated with the autonomous driving system 100. For example, the generation module 420 may generate a local map associated with an estimated location of a subject based on point-cloud data. More descriptions of the generation of the local map may be found elsewhere in the present disclosure (e.g., FIG. 5, and descriptions thereof) .
  • the determination module 430 may be configured to determine data and/or information associated with the autonomous driving system 100.
  • the determination module 430 may determine a target location of a subject by matching a local map with a reference map. For example, the determination module 430 may determine a similarity between a local map and each of a plurality of cells in a reference map using a normalized cross-correlation (NCC) technique.
  • the determination module 430 may determine one of the plurality of cells that matches with the local map based on the similarity between the local map and each of the plurality of cells.
  • the determination module 430 may determine, based on the one of the plurality of cells that matches with the local map, the target location of the subject.
  • the storage module 440 may be configured to store data and/or information associated with the autonomous driving system 100.
  • the storage module 440 may store an estimated location of a subject.
  • the storage module 440 may store point-cloud data associated with an estimated location of a subject.
  • the storage module 440 may store a local map associated with an estimated location of a subject.
  • the storage module 440 may store a reference map associated with an estimated location of a subject.
  • the storage module 440 may store a target location of a subject.
  • the modules in the processing engine 122 may be connected to or communicate with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof.
  • Two or more of the modules may be combined into a single module, and any one of the modules may be divided into two or more units.
  • one or more modules may be omitted.
  • the storage module 440 may be omitted.
  • one or more modules may be combined into a single module.
  • the generation module 420 and the determination module 430 may be combined into a single module.
  • FIG. 5 is a flowchart illustrating an exemplary process for positioning a subject according to some embodiments of the present disclosure.
  • the process 500 may be executed by the autonomous driving system 100.
  • the process 500 may be implemented as a set of instructions stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 500.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 500 as illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing engine 122 may obtain an estimated location of a subject.
  • the subject may be any composition of organic and/or inorganic matters that are with or without life and located on earth.
  • the subject may be an autonomous vehicle (e.g., the vehicle 110) as described elsewhere in the present disclosure (FIG. 1, and descriptions thereof) .
  • the estimated location of the subject may be a geographic location where the subject is located.
  • the geographic location may be denoted by geographic coordinates (e.g., longitudinal and latitudinal coordinates) of the location.
  • the processing engine 122 may obtain the geographic location of the subject from one or more components of the autonomous driving system 100.
  • the subject may be associated with a sensor (e.g., the sensors 112) with positioning function, and the processing engine 122 may obtain the geographic coordinates of the subject from the sensor.
  • the processing engine 122 may obtain the geographic coordinates of the subject via a GPS device and/or an inertial measurement unit (IMU) sensor mounted on the subject as described elsewhere in the present disclosure (e.g., FIG. 1, and descriptions thereof) .
  • the processing engine 122 may continuously or periodically obtain geographic coordinates of the subject from the sensor (e.g., GPS device) . Additionally or alternatively, the sensor with positioning function (e.g., GPS device) may transmit the geographic coordinates of the subject to a storage (e.g., the storage device 140) of the autonomous driving system 100 via the network 150 continuously or periodically. The processing engine 122 may access the storage and retrieve one or more geographic coordinates of the subject.
  • the processing engine 122 may obtain point-cloud data associated with the estimated location.
  • the point-cloud data may be generated by a sensor (e.g., a LiDAR) by emitting laser pulses to scan a space around the estimated location of the subject.
  • the processing engine 122 may obtain the point-cloud data associated with the estimated location from one or more sensors (e.g., the sensors 112) associated with the subject, or from a storage (e.g., the storage device 140).
  • the one or more sensors may include a LiDAR as described elsewhere in the present disclosure (e.g., FIG. 1, and descriptions thereof) .
  • one or more LiDARs may be mounted on the subject (e.g., the vehicle 110) to send laser pulses to the earth’s surface and/or surrounding objects (e.g., buildings, pedestrians, other vehicles) .
  • the laser pulses may be reflected back to the one or more LiDARs.
  • the one or more LiDARs may generate point-cloud data based on received information.
  • the one or more LiDARs mounted on the subject may rotate by a certain angle (e.g., 120 degrees, 360 degrees) multiple times per second to continuously generate the point-cloud data.
  • the point-cloud data may refer to a set of data points associated with one or more objects in the space around the estimated location of the subject (e.g., a vehicle).
  • a point may correspond to a point or region of the object.
  • the one or more objects around the subject may include a lane mark, a building, a pedestrian, an animal, a plant, a vehicle, or the like.
  • the point-cloud data may have a plurality of attributes.
  • the attributes of the point-cloud data may include point-cloud coordinates (e.g., X, Y and Z coordinates) of each data point in a 3D point-cloud coordinate system, elevation information associated with each data point, intensity information associated with each data point, a return number, a total count of returns, a classification of each data point, a scan direction, or the like, or any combination thereof.
  • elevation information associated with a data point may refer to the height of the data point above or below a fixed reference point, line, or plane (e.g., most commonly a reference geoid, a mathematical model of the Earth's sea level as an equipotential gravitational surface).
  • “Intensity information associated with a data point” may refer to return strength of the laser pulse emitted from the sensor (e.g., LiDAR) and reflected by the object for generating the data point.
  • “Return number” may refer to the pulse return number for a given output laser pulse emitted from the sensor (e.g., LiDAR) and reflected by the object.
  • an emitted laser pulse may have various levels of returns depending on the features it is reflected from and the capabilities of the sensor (e.g., a laser scanner) used to collect the point-cloud data. For example, the first return may be flagged as return number one, the second as return number two, and so on. “The total count of returns” may refer to the total number of returns for a given pulse.
  • the classification of a data point may refer to the type of the data point (or of the object) that reflected the laser pulse.
  • the set of data points may be classified into a number of categories including bare earth or ground, a building, a person, water, etc.
  • Scan direction may refer to the direction in which a scanning mirror in the LiDAR was directed when a data point was detected.
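  • For illustration, the per-point attributes described above might be grouped in a simple record; the field names and types below are assumptions of the example, not a format defined by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class LidarPoint:
    """Illustrative container for the point attributes described above."""
    x: float                 # point-cloud coordinates in a 3D coordinate system
    y: float
    z: float
    elevation: float         # height relative to a reference surface (e.g., a geoid)
    intensity: float         # return strength of the reflected laser pulse
    return_number: int       # index of this return for the emitted pulse (1 = first return)
    total_returns: int       # total number of returns for the pulse
    classification: str      # e.g., "ground", "building", "person", "water"
    scan_direction: int      # direction of the scanning mirror when the point was recorded
```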
  • the point-cloud data may include a plurality of point cloud frames.
  • Each point cloud frame may correspond to a time point or a time period.
  • “A time point or a time period corresponding to a point cloud frame” may refer to the time when the LiDAR generates the point cloud frame.
  • the processing engine 122 may generate, based on the point-cloud data, a local map associated with the estimated location.
  • a local map may refer to a set of point-clouds in a region with the estimated location of the subject as the center.
  • the shape of the region may be a regular triangle, a rectangle, a square, a regular hexagon, a circle, or the like.
  • the size of the local map may be M meters × M meters. M may be any positive number, for example, 5, 10, 20, 50, 100, 500, etc.
  • the processing engine 122 may divide the local map into a plurality of grids. The size of each grid may be K meters × K meters. K may be any positive number less than M, for example, 0.1, 0.2, 0.5, 1, etc.
  • the processing engine 122 may register and/or stitch a plurality of point cloud frames in the point-cloud data for generating a local map using a registration technique.
  • point cloud registration and/or stitching may refer to a process of associating a plurality of point cloud frames into a common coordinate system.
  • the processing engine 122 may register and/or stitch a plurality of point cloud frames based on one or more point cloud registration algorithms.
  • Exemplary point cloud registration algorithms may include an iterative closest point (ICP) algorithm, a robust point matching (RPM) algorithm, a kernel correlation (KC) algorithm, a coherent point drift (CPD) algorithm, a sorting the correspondence space (SCS) algorithm, or the like.
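  • A minimal, brute-force sketch of one ICP-style registration loop is given below for illustration only; a production system would typically rely on an optimized library implementation or one of the other algorithms listed above.

```python
import numpy as np

def icp(source: np.ndarray, target: np.ndarray, iterations: int = 20):
    """Small ICP-style sketch: align source (N, 3) to target (M, 3).

    Uses brute-force nearest neighbours; returns rotation R and translation t such that
    source @ R.T + t approximately overlays target.
    """
    r, t = np.eye(3), np.zeros(3)
    moved = source.copy()
    for _ in range(iterations):
        # 1. closest target point for every source point (O(N*M), acceptable for a sketch)
        dists = np.linalg.norm(moved[:, None, :] - target[None, :, :], axis=2)
        matched = target[dists.argmin(axis=1)]
        # 2. best rigid transform between the matched sets (Kabsch / SVD)
        mu_s, mu_t = moved.mean(axis=0), matched.mean(axis=0)
        h = (moved - mu_s).T @ (matched - mu_t)
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        r_step = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t_step = mu_t - r_step @ mu_s
        # 3. apply the incremental transform and accumulate it
        moved = moved @ r_step.T + t_step
        r, t = r_step @ r, r_step @ t + t_step
    return r, t
```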
  • the processing engine 122 may determine the local map based on the registered and/or stitched point cloud data. For example, the processing engine 122 may map the 3D registered and/or stitched point cloud data into a 2D local map with the estimated location of the subject as the center. The processing engine 122 may determine intensity information and/or elevation information associated with each grid of the plurality of grids in the local map based on the intensity information and/or elevation information associated with each data point of the point-cloud data and the geographic coordinates of each data point.
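  • One possible way to rasterize the registered points into a 2D local map centered on the estimated location, with a mean intensity value and a mean elevation value per grid, is sketched below; the map size, grid size, and averaging scheme are illustrative choices, not the disclosed implementation.

```python
import numpy as np

def rasterize_local_map(points_xy, intensity, elevation, center_xy,
                        map_size_m=50.0, grid_size_m=0.5):
    """Build intensity and elevation grids (n x n) around center_xy from registered points."""
    points_xy = np.asarray(points_xy, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    elevation = np.asarray(elevation, dtype=float)
    n = int(round(map_size_m / grid_size_m))
    half = map_size_m / 2.0
    # grid indices of every point relative to the map's lower-left corner
    idx = np.floor((points_xy - (np.asarray(center_xy) - half)) / grid_size_m).astype(int)
    inside = np.all((idx >= 0) & (idx < n), axis=1)
    idx, intensity, elevation = idx[inside], intensity[inside], elevation[inside]

    sums_i, sums_e, counts = np.zeros((n, n)), np.zeros((n, n)), np.zeros((n, n))
    np.add.at(sums_i, (idx[:, 1], idx[:, 0]), intensity)   # row = y index, col = x index
    np.add.at(sums_e, (idx[:, 1], idx[:, 0]), elevation)
    np.add.at(counts, (idx[:, 1], idx[:, 0]), 1)
    counts = np.maximum(counts, 1)                          # avoid division by zero in empty grids
    return sums_i / counts, sums_e / counts
```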
  • the processing engine 122 may perform a preprocessing operation on the point-cloud data and generate the local map based on the preprocessed point-cloud data and/or perform a post-processing operation on the local map.
  • Exemplary preprocessing operations and/or post-processing operations may include a denoising operation, removal of moving objects, a calibration of the point-cloud data (e.g., calibration of intensity information in the point-cloud data), etc.
  • the processing engine 122 may process at least one of the local map or the point-cloud data using a first trained machine learning model. For example, the processing engine 122 may denoise the at least one of the local map or the point-cloud data associated with the estimated location using the first trained machine learning model.
  • Denoising a local map and/or point-cloud data may refer to a process of correcting the local map and/or the point-cloud data when it contains outliers and surface noise.
  • the noise of the point-cloud data may come from hardware causes (e.g., inherent limitations of the LiDAR), software causes (e.g., when point-cloud data is generated from algorithms, data points may be located incorrectly due to imprecise triangulation), environmental causes (e.g., surrounding contamination, such as dust in the air), etc.
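  • The disclosure performs denoising with a first trained machine learning model; purely as a classical stand-in for illustration, a statistical outlier removal step (not the disclosed model) might look like the following sketch.

```python
import numpy as np

def remove_statistical_outliers(points: np.ndarray, k: int = 8, std_ratio: float = 2.0):
    """Keep points whose mean distance to their k nearest neighbours is not unusually large."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)      # skip column 0 (distance to itself)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]
```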
  • the processing engine 122 may remove one or more moving objects from the local map using the first trained machine learning model.
  • a moving object may refer to an object whose condition (e.g., location, shape) may change over time and which may not need to be drawn on the local map.
  • the moving object may include a plant (e.g., a tree, a flower) , a vehicle, a pedestrian, an animal, or the like, or any combination thereof.
  • the calibration of the point-cloud data may include a geometric calibration of the sensor (e.g., LiDAR) , an intensity information calibration, etc.
  • the processing engine 122 may perform a geometric calibration on the LiDAR.
  • the intensity information associated with the point-cloud data acquired by the LiDAR may be affected by a plurality of parameters, including a transmittal power, a transmittal range, an angle of incidence, an atmospheric transmittance, a beam divergence, detector responsivity, etc.
  • the processing engine 122 may perform a radiometric calibration on the intensity information presented in the local map or the point-cloud data.
  • the geometric calibration of the LiDAR may be used to estimate and remove all the systematic errors from the point-cloud data such that only random errors are left.
  • the systematic errors in LiDAR data may be caused by biases in system parameters, e.g., biases in the mounting parameters relating the system components (e.g., a lever arm and a boresight angle of the LiDAR) and biases in the measured ranges and mirror angles.
  • the processing engine 122 may perform the geometric calibration based on one or more geometric calibration algorithms, for example, a system-driven algorithm, and/or a data-driven algorithm.
  • the radiometric calibration of the LiDAR may be used to convert the recorded intensity information into the spectral reflectance of an object.
  • the processing engine 122 may calibrate the point-cloud data using the first trained machine learning model.
  • the processing engine 122 may calibrate the intensity information presented in the local map based on one or more pre-determined calibration parameters.
  • the pre-determined calibration parameters may be set manually by an operator, or be determined by one or more components of the autonomous driving system 100.
  • the one or more pre-determined calibration parameters may be provided by a manufacturer of the sensor (e.g., the LiDAR).
  • the processing engine 122 may calibrate the intensity information presented in the local map based on a radar equation.
  • the radar equation may consider the effects of the measured laser range, an angle of reflection, and an atmospheric attenuation, to retrieve the surface reflectance in the near infrared (NIR) spectrum.
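  • As an illustration of such a correction (not the exact radar equation of the disclosure), a simplified range-, angle-, and attenuation-normalized intensity might be computed as follows; the reference range and atmospheric coefficient are assumed parameters.

```python
import numpy as np

def calibrate_intensity(raw_intensity, laser_range, incidence_angle,
                        reference_range=20.0, atmospheric_coeff=0.0):
    """Simplified correction: I_cal = I_raw * (R / R_ref)^2 / cos(angle) * exp(2*a*R)."""
    raw_intensity = np.asarray(raw_intensity, dtype=float)
    laser_range = np.asarray(laser_range, dtype=float)
    cos_a = np.clip(np.cos(incidence_angle), 1e-3, 1.0)    # guard against grazing angles
    range_term = (laser_range / reference_range) ** 2       # undo 1/R^2 falloff
    atmos_term = np.exp(2.0 * atmospheric_coeff * laser_range)  # undo two-way attenuation
    return raw_intensity * range_term * atmos_term / cos_a
```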
  • the first trained machine learning model may be constructed based on an artificial neural network, a support vector machine (SVM) model, a Bayesian network, a genetic model, or the like, or any combination thereof.
  • the processing engine 122 may determine the first trained machine learning model by training a first machine learning model.
  • the processing engine 122 may obtain a plurality of first training samples.
  • the plurality of first training samples may include sample maps with noise and corresponding sample maps without noise.
  • the plurality of first training samples may include sample maps with moving objects and corresponding sample maps without the moving objects.
  • the processing engine 122 may extract one or more first sample features with respect to each of the plurality of first training samples.
  • the one or more first sample features may include a color feature, a texture feature, a shape feature, a spatial relationship feature, an intensity feature, an elevation feature, or the like, or any combination thereof.
  • the processing engine 122 may determine the one or more first sample features with respect to each of the plurality of first training samples as training data.
  • the processing engine 122 may generate the first trained machine learning model based on the training data according to a training process.
  • the training process may include one or more iterations.
  • the processing engine 122 may generate a first candidate machine learning model.
  • the processing engine 122 may repeat the iterations until a loss function of the generated first candidate machine learning model converges to a threshold.
  • the threshold may be a predetermined value stored in a storage device (e.g., the storage device 140) , and/or a dynamic threshold.
  • the processing engine 122 may obtain the first trained machine learning model from a storage device in the autonomous driving system 100 (e.g., the storage device 140) and/or an external data source (not shown) via the network 150.
  • the first trained machine learning model may be pre-trained (by the processing engine 122 or any other platforms or devices) and stored in the storage device in the autonomous driving system 100.
  • the processing engine 122 may access the storage device and retrieve the first trained machine learning model.
  • the processing engine 122 may obtain, based on the estimated location, a reference map.
  • a specific reference map may present a plurality of attributes of objects as described elsewhere in the present disclosure in a specific space around a specific reference location.
  • the specific reference location may be the center of the reference map.
  • the geographic coordinates corresponding to the specific reference location or other locations of objects may be pre-determined and stored in a storage (e.g., the storage device 140) .
  • the reference map may be generated off-line.
  • For example, a vehicle (e.g., the vehicle 110) equipped with a plurality of sensors with high accuracy (e.g., a LiDAR) may collect scanning data of a region in advance, and a processing engine (e.g., the processing engine 122) may generate a plurality of high-definition maps based on the collected data.
  • Each of the plurality of high-definition maps may correspond to one or more specific locations.
  • the processing engine 122 may access the storage device (e.g., the storage device 140) and retrieve a corresponding reference map from the plurality of high-definition maps based on the estimated location of the subject.
  • the generation of the reference map may be the same as or different from the generation of the local map.
  • the reference map may be generated using one or more point cloud registration algorithms that are the same as or different from those used for the local map.
  • the size of the reference map may be greater than the size of the local map.
  • the reference map may be 50 meters × 50 meters, and the local map may be 1 meter × 1 meter.
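  • Assuming the pre-built high-definition map is stored as a georeferenced 2D raster, a reference tile around the estimated location might be cropped as sketched below; the tile size, resolution, and array layout are assumptions of the example.

```python
import numpy as np

def crop_reference_map(hd_map: np.ndarray, map_origin_xy, resolution_m,
                       estimated_xy, tile_size_m=50.0):
    """Crop a tile_size_m x tile_size_m window of hd_map centered on estimated_xy."""
    half_cells = int(round(tile_size_m / (2 * resolution_m)))
    col = int(round((estimated_xy[0] - map_origin_xy[0]) / resolution_m))
    row = int(round((estimated_xy[1] - map_origin_xy[1]) / resolution_m))
    r0, r1 = max(row - half_cells, 0), min(row + half_cells, hd_map.shape[0])
    c0, c1 = max(col - half_cells, 0), min(col + half_cells, hd_map.shape[1])
    return hd_map[r0:r1, c0:c1]
```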
  • the processing engine 122 may process the reference map and/or the reference point-cloud data using a second trained machine learning model. For example, the processing engine 122 may denoise the reference map and/or the reference point-cloud data using the second trained machine learning model as described in connection with operation 530. As another example, the processing engine 122 may remove one or more moving objects from the reference map and/or the reference point-cloud data using the second trained machine learning model as described in connection with operation 530. As still another example, the processing engine 122 may calibrate the intensity information presented in the reference point-cloud data using the second trained machine learning model.
  • the second trained machine learning model may be the same as or different from the first trained machine learning model. For example, the second machine learning model may be constructed based on an artificial neural network, a support vector machine (SVM) model, a Bayesian network, a genetic model, or the like, or any combination thereof.
  • the processing engine 122 may determine the second trained machine learning model by training a second machine learning model. In some embodiments, the processing engine 122 may obtain a plurality of second training samples. The plurality of second training samples may be the same as or different from the plurality of first training samples. For example, the plurality of second training samples may include sample maps with moving objects and corresponding sample maps without the moving objects. As another example, the plurality of second training samples may include sample maps with noises and corresponding sample maps without noises. As still another example, the plurality of second training samples may include sample maps with original intensity information and corresponding sample maps with calibrated intensity information. The second trained machine learning model may be generated by training the second machine learning model as described in connection with the training of the first machine learning model. The first machine learning model may be the same as or different from the second machine learning model.
  • the processing engine 122 may determine a target location of the subject by matching the local map with the reference map.
  • the reference map may include a plurality of cells (also referred to as sub-reference maps) having the same size as the local map.
  • the matching between the local map and the reference map may refer to determining one of the plurality of cells of the reference map that has a similarity with the local map satisfying a condition.
  • the processing engine 122 may determine a similarity between the local map and each of the plurality of cells in the reference map using a pattern-matching technique. For example, the processing engine 122 may determine the similarity between the local map and each of the plurality of cells in the reference map using a normalized cross-correlation (NCC) technique. The processing engine 122 may determine one of the plurality of cells that matches with the local map based on the similarity between the local map and each of the plurality of cells in the reference map.
  • the processing engine 122 may determine the target location of the subject based on the one of the plurality of cells (e.g., having a maximum similarity) that matches with the local map. For example, the processing engine 122 may designate a location (e.g., a center point) corresponding to the one of the plurality of cells having the maximum similarity with the local map as the target location. In some embodiments, the processing engine 122 may determine a plurality of locations within the plurality of cells. Each of the plurality of locations may correspond to one of the plurality of cells. The processing engine 122 may determine, based on the plurality of locations and the similarity corresponding to each of the plurality of cells, the target location of the subject. The target location of the subject may be more accurate than the estimated location of the subject obtained in operation 510. More descriptions of the determination of the target location of the subject may be found elsewhere in the present disclosure (e.g., FIG. 6, FIG. 7, and descriptions thereof) .
  • one or more other optional operations may be added elsewhere in the exemplary process 500.
  • the processing engine 122 may store information and/or data associated with the local map and/or the reference map in a storage (e.g., the storage device 140) disclosed elsewhere in the present disclosure.
  • the order of the operations in process 500 may be changed. For example, operation 540 may be performed before operation 520.
  • FIG. 6 is a flowchart illustrating an exemplary process for positioning a subject according to some embodiments of the present disclosure.
  • the process 600 may be executed by the autonomous driving system 100.
  • the process 600 may be implemented as a set of instructions stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 600.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 600 are illustrated in FIG. 6 and described below is not intended to be limiting.
  • the processing engine 122 may determine a similarity between a local map and each of a plurality of cells in a reference map using a normalized cross-correlation (NCC) technique.
  • the local map and/or the reference map may be obtained as described in connection with operations 510 to 540 illustrated in FIG. 5.
  • the processing engine 122 may determine a plurality of cells in the reference map according to the size of the local map. For example, the processing engine 122 may determine the plurality of cells by sliding a window on the reference map.
  • the size (e.g., a length, a width) of each cell may be the same as the size (e.g., a length, a width) of the local map.
  • the size of the local map may be 1 meter × 1 meter, and the size of a cell may be 1 meter × 1 meter.
  • a similarity between the local map and a cell in the reference map may be used to evaluate the similarity between attribute information of one or more objects (e.g., intensity information and/or elevation information) presented in the local map and attribute information of one or more objects (e.g., intensity information and/or elevation information) presented in the cell in the reference map.
  • the processing engine 122 may determine the similarity between the local map and each of the plurality of cells in the reference map using a normalized cross-correlation (NCC) technique.
  • the NCC may refer to a correlation measure for determining a similarity between points in two or more images.
  • the similarity between the local map and a cell in the reference map may be determined according to Equation (1) :
  • R (x, y) = Σ_{x′, y′} [T (x′, y′) · I (x+x′, y+y′) ] / √ (Σ_{x′, y′} T (x′, y′) ² · Σ_{x′, y′} I (x+x′, y+y′) ²)      (1)
  • where R (x, y) refers to a similarity between a local map and a cell in a reference map; (x, y) refers to coordinates of a base point of the cell in the reference map, 0 ≤ x ≤ (M−N) , 0 ≤ y ≤ (M−N) ; (x′, y′) refers to coordinates of a point in the local map; T (x′, y′) refers to information (e.g., intensity information and/or elevation information) associated with the point (x′, y′) in the local map; and I (x+x′, y+y′) refers to information (e.g., intensity information and/or elevation information) associated with the point (x+x′, y+y′) in the cell in the reference map.
  • the base point of the cell in the reference map may be a top-left point of the cell in the reference map. Accordingly, the processing engine 122 may determine a (M−N) × (M−N) similarity distribution map (also referred to as a probability map) .
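  • A minimal NumPy sketch of this NCC matching is given below: the local map T is slid over the reference map I, and a similarity value is computed for every base point, yielding the similarity distribution map R described above. The array shapes and the function name ncc_similarity_map are illustrative assumptions, and boundary handling (the disclosure describes an (M−N) × (M−N) map) may differ in practice.

```python
import numpy as np

def ncc_similarity_map(T, I):
    """Slide the local map T (N x N) over the reference map I (M x M) and
    return a similarity distribution map R with one NCC value per base point."""
    N, M = T.shape[0], I.shape[0]
    positions = M - N + 1                      # number of valid top-left base points per axis
    R = np.zeros((positions, positions))
    t_norm = np.sqrt(np.sum(T ** 2))
    for x in range(positions):                 # base point (x, y) = top-left corner of the cell
        for y in range(positions):
            cell = I[x:x + N, y:y + N]
            denom = t_norm * np.sqrt(np.sum(cell ** 2))
            R[x, y] = np.sum(T * cell) / denom if denom > 0 else 0.0
    return R

# illustrative usage with random maps
rng = np.random.default_rng(0)
R = ncc_similarity_map(rng.random((8, 8)), rng.random((50, 50)))
```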
  • the processing engine 122 may determine one of the plurality of cells that matches with the local map based on the similarity between the local map and each of the plurality of cells.
  • the processing engine 122 may select one cell from the plurality of cells that matches with the local map (also referred to as a target cell) based on the similarity between the local map and each of the plurality of cells in the reference map. For example, the processing engine 122 may determine a cell having the maximum similarity with the local map as the target cell.
  • the processing engine 122 may determine, based on the one of the plurality of cells that matches with the local map, a target location of the subject.
  • the processing engine 122 may designate a location in the target cell as the target location of the subject. For example, the processing engine 122 may determine a center point of the target cell as the target location of the subject.
  • the processing engine 122 may determine a plurality of locations within the plurality of cells. Each of the plurality of locations may correspond to one of the plurality of cells. The processing engine 122 may determine, based on the plurality of locations and the similarity corresponding to each of the plurality of cells, the target location of the subject. In some embodiments, the processing engine 122 may determine a weight for each of the plurality of locations by normalizing a plurality of similarities between the local map and the plurality of cells corresponding to the plurality of locations. The processing engine 122 may determine weighted arithmetic mean coordinates of a plurality of coordinates of the plurality of locations in the reference map. For example, the processing engine 122 may determine the weighted arithmetic mean coordinates according to Equation (2) :
  • x = Σ_{i=1}^{n} P_i · x_i,  y = Σ_{i=1}^{n} P_i · y_i      (2)
  • where (x, y) refers to the coordinates of the target location; (x_1, y_1) , (x_2, y_2) ... (x_n, y_n) refer to coordinates of the plurality of locations in the plurality of cells in the reference map; n refers to the number of locations; and P_i refers to the weight for a location (x_i, y_i) in the reference map.
  • the processing engine 122 may determine a location corresponding to the weighted arithmetic mean coordinates as the target location of the subject.
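  • The following Python sketch illustrates Equation (2) under the assumption that each cell contributes its center coordinates: the similarities are normalized into weights and the target location is their weighted arithmetic mean. The input names similarities and cell_centers are hypothetical.

```python
import numpy as np

def weighted_target_location(similarities, cell_centers):
    """similarities: (k,) array; cell_centers: (k, 2) array of (x, y) map coordinates."""
    P = similarities / np.sum(similarities)   # normalize similarities into weights that sum to 1
    return P @ cell_centers                   # (sum_i P_i * x_i, sum_i P_i * y_i)

# illustrative usage; the maximum-similarity alternative would be cell_centers[np.argmax(similarities)]
centers = np.array([[10.0, 20.0], [11.0, 20.0], [10.0, 21.0]])
target_xy = weighted_target_location(np.array([0.7, 0.9, 0.6]), centers)
```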
  • the processing engine 122 may fit the plurality of similarities between the local map and the plurality of cells in the reference map. For example, the processing engine 122 may fit the plurality of similarities between the local map and the plurality of cells in the reference map according to a curve fitting method (e.g., a least square method) . The processing engine 122 may determine a location corresponding to a peak of a fitting curve as the target location of the subject.
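  • As an illustrative alternative, the sketch below refines the peak along one axis of the similarity map by fitting a least-squares quadratic (np.polyfit) around the maximum and taking the vertex of the parabola as the sub-cell peak; the window size and the one-dimensional treatment are assumptions, not specified in the disclosure.

```python
import numpy as np

def subcell_peak(similarity_profile):
    """Return a sub-cell peak position along one axis of the similarity map,
    assuming the profile has at least three samples around its maximum."""
    i = int(np.argmax(similarity_profile))
    lo, hi = max(i - 2, 0), min(i + 3, len(similarity_profile))
    xs = np.arange(lo, hi)
    a, b, _ = np.polyfit(xs, similarity_profile[lo:hi], 2)   # least-squares quadratic fit
    return float(-b / (2 * a)) if a < 0 else float(i)        # vertex of the fitted parabola

# illustrative usage on a bell-shaped profile with a true peak near 7.3
xs = np.arange(20)
profile = np.exp(-0.1 * (xs - 7.3) ** 2)
peak = subcell_peak(profile)
```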
  • geographic coordinates of each of a plurality of locations in the reference map may be pre-determined.
  • the processing engine 122 may obtain the geographic coordinates of the target location in the reference map from a storage (e.g., the storage device 140) .
  • the processing engine 122 may transform the target location of the subject in the reference map to a geographic location of the subject in a geographic coordinate system.
  • a position of a geographic location on the earth's surface may be represented by a latitude coordinate and a longitude coordinate.
  • a relationship between coordinates of each of a plurality of locations in the reference map, and latitude and longitude coordinates of the each of the plurality of locations may be stored in a storage device of the autonomous driving system 100.
  • the processing engine 122 may determine the corresponding latitude and longitude coordinates based on the coordinates of the target location in the reference map and the stored relationship between the coordinates of each of the plurality of locations in the reference map and the latitude and longitude coordinates of the each of the plurality of locations.
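  • As a minimal sketch of this coordinate transformation, the example below assumes the stored relationship reduces to an affine mapping defined by the tile's geographic origin and a per-cell resolution; the parameters origin_lat, origin_lon, and resolution_m, as well as the equirectangular approximation, are illustrative assumptions rather than the disclosed relationship.

```python
import math

def map_to_geographic(x, y, origin_lat, origin_lon, resolution_m=1.0):
    """Convert target-location coordinates (x east, y north, in reference-map cells)
    to latitude/longitude, assuming an affine relationship anchored at the tile origin."""
    meters_per_deg_lat = 111320.0                                        # rough spherical approximation
    meters_per_deg_lon = 111320.0 * math.cos(math.radians(origin_lat))
    lat = origin_lat + (y * resolution_m) / meters_per_deg_lat
    lon = origin_lon + (x * resolution_m) / meters_per_deg_lon
    return lat, lon

# illustrative usage: a target location 12.5 cells east and 30.0 cells north of the tile origin
lat, lon = map_to_geographic(12.5, 30.0, origin_lat=39.9042, origin_lon=116.4074)
```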
  • one or more other optional operations may be added elsewhere in the exemplary process 600.
  • the processing engine 122 may store information and/or data associated with the reference map (e.g., the similarity between the local map and each of the plurality of cells in the reference map) in a storage (e.g., the storage device 140) disclosed elsewhere in the present disclosure.
  • FIG. 7 is a flowchart illustrating an exemplary process for determining a similarity between a local map and a cell in a reference map according to some embodiments of the present disclosure.
  • the process 700 may be executed by the autonomous driving system 100.
  • the process 700 may be implemented as a set of instructions stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 700.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 700 are illustrated in FIG. 7 and described below is not intended to be limiting.
  • the processing engine 122 may determine, based on intensity information presented in a local map and each of a plurality of cells, a first similarity between the local map and each of the plurality of cells in a reference map. In some embodiments, the processing engine 122 may determine the first similarity between the local map and each of the plurality of cells in the reference map according to Equation (1) as described in connection with operation 610.
  • the processing engine 122 may determine, based on elevation information presented in the local map and each of the plurality of cells, a second similarity between the local map and each of the plurality of cells in the reference map. In some embodiments, the processing engine 122 may determine the second similarity between the local map and each of the plurality of cells in the reference map according to Equation (1) as described in connection with operation 610.
  • the processing engine 122 may determine, based on the first similarity and the second similarity, a similarity between the local map and each of the plurality of cells in the reference map.
  • the processing engine 122 may weight the first similarity and the second similarity to determine a target similarity between a cell and the local map. For example, the processing engine 122 may determine a first weight corresponding to the first similarity. The processing engine 122 may determine a second weight corresponding to the second similarity. The first weight (or the second weight) corresponding to the first similarity (or second similarity) may reflect the importance of the first similarity (or the second similarity) in the determination of the similarity between the local map and each of the plurality of cells. For example, if there are mountains, buildings, or canyons presented in the local map, the processing engine 122 may determine a relatively large second weight corresponding to the second similarity compared with the first weight corresponding to the first similarity.
  • in other cases (e.g., if the elevation information presented in the local map is less distinctive), the processing engine 122 may determine a relatively large first weight corresponding to the first similarity compared with the second weight corresponding to the second similarity.
  • the first weight and the second weight may be set manually by a user, or be determined by one or more components of the autonomous driving system 100 according to default settings.
  • the processing engine 122 may determine the similarity between the local map and each of the plurality of cells based on the first weight, the second weight, the first similarity, and the second similarity.
  • the similarity between the local map and each of the plurality of cells may be determined according to Equation (3) :
  • P = ω · P_1 + (1−ω) · P_2      (3)
  • where P refers to a similarity between the local map and a cell; P_1 refers to a first similarity between the local map and the cell; P_2 refers to a second similarity between the local map and the cell; ω refers to a first weight corresponding to the first similarity between the local map and the cell; and (1−ω) refers to a second weight corresponding to the second similarity between the local map and the cell.
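  • A short Python sketch of Equation (3) is given below; the default weight value and the function name combine_similarities are illustrative, and choosing the weight (e.g., favoring elevation in built-up or mountainous areas) is application-dependent.

```python
import numpy as np

def combine_similarities(P1, P2, w=0.5):
    """Blend the intensity-based similarity map P1 and the elevation-based
    similarity map P2 (same shape) as w * P1 + (1 - w) * P2, with w in [0, 1]."""
    return w * np.asarray(P1) + (1.0 - w) * np.asarray(P2)

# illustrative usage favoring elevation (smaller w) in hilly or built-up areas
combined = combine_similarities(np.random.rand(43, 43), np.random.rand(43, 43), w=0.3)
```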
  • one or more operations may be performed simultaneously.
  • operation 710 and operation 720 may be performed simultaneously.
  • the order of one or more operations may be changed.
  • operation 720 may be performed before operation 710.
  • FIG. 8 is a schematic diagram illustrating an exemplary process for determining a target location of a subject according to some embodiments of the present disclosure.
  • process 800 may illustrate the process for determining the target location of the subject (e.g., the vehicle 110) in combination with process 500 in FIG. 5, process 600 in FIG. 6, and process 700 in FIG. 7.
  • the processing engine 122 may obtain an initial pose (e.g., the estimated location) of the subject as described in connection with operation 510.
  • the processing engine 122 may obtain a reference map associated with the initial pose of the subject based on a HD map service as described in connection with operation 540.
  • the reference map may include intensity information and elevation information associated with each data point in the reference map.
  • the processing engine 122 may generate a local map based on point-cloud data associated with the initial pose of the subject as described in connection with operation 520 and operation 530.
  • the local map may include intensity information and elevation information associated with each data point in the local map.
  • the processing engine 122 may match the reference map and the local map by using a NCC technique as described in connection with operation 550.
  • the processing engine 122 may determine a similarity between the local map and each of a plurality of cells in the reference map.
  • the processing engine 122 may determine one or more probability maps (also referred to as a similarity distribution map) based on the similarity between the local map and each of the plurality of cells in the reference map as described in connection with operation 610.
  • the processing engine 122 may determine an exact location (e.g., the target location) of the subject in the reference map based on one of the plurality of cells that matches with the local map as described in connection with operation 620 and operation 630. In 860, the processing engine 122 may transform the exact location in the reference map to an absolute position in a geographic coordinate system as described in connection with operation 630.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc. ) , or in an implementation combining software and hardware that may all generally be referred to herein as a “unit, ” “module, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in a combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user’s computer, partly on the user’s computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user’s computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • the numbers expressing quantities or properties used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about, ” “approximate, ” or “substantially. ”
  • “about, ” “approximate, ” or “substantially” may indicate ±20% variation of the value it describes, unless otherwise stated.
  • the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment.
  • the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
PCT/CN2019/085637 2019-04-09 2019-05-06 Systems and methods for positioning WO2020206774A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910279859.3A CN111854748B (zh) 2019-04-09 2019-04-09 Positioning system and method
CN201910279859.3 2019-04-09

Publications (1)

Publication Number Publication Date
WO2020206774A1 true WO2020206774A1 (en) 2020-10-15

Family

ID=72752181

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/085637 WO2020206774A1 (en) 2019-04-09 2019-05-06 Systems and methods for positioning

Country Status (2)

Country Link
CN (1) CN111854748B (zh)
WO (1) WO2020206774A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112406964B (zh) * 2020-11-10 2022-12-02 北京埃福瑞科技有限公司 Train positioning method and system
CN115068644B (zh) * 2022-05-11 2024-03-15 深圳市优必选科技股份有限公司 Robot and disinfection control method, apparatus and storage medium therefor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108268483A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Method for generating a grid map for navigation control of an unmanned vehicle
CN108268518A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Apparatus for generating a grid map for navigation control of an unmanned vehicle
CN108398705A (zh) * 2018-03-06 2018-08-14 广州小马智行科技有限公司 Map generation method and apparatus, and vehicle positioning method and apparatus
CN109064506A (zh) * 2018-07-04 2018-12-21 百度在线网络技术(北京)有限公司 High-precision map generation method and apparatus, and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103310482B (zh) * 2012-03-12 2016-08-10 山东智慧生活数据系统有限公司 Three-dimensional reconstruction method and system
US11137255B2 (en) * 2015-08-03 2021-10-05 Tomtom Global Content B.V. Methods and systems for generating and using localization reference data
CN106202543A (zh) * 2016-07-27 2016-12-07 苏州家佳宝妇幼医疗科技有限公司 Ontology matching method and system based on machine learning
CN106842226A (zh) * 2017-01-19 2017-06-13 谢建平 LiDAR-based positioning system and method
CN109285188B (zh) * 2017-07-21 2020-04-21 百度在线网络技术(北京)有限公司 Method and apparatus for generating position information of a target object
CN108007453A (zh) * 2017-12-11 2018-05-08 北京奇虎科技有限公司 Point-cloud-based map updating method, apparatus and electronic device
CN108121800B (zh) * 2017-12-21 2021-12-21 北京百度网讯科技有限公司 Artificial-intelligence-based information generation method and apparatus
CN108549375A (zh) * 2018-04-16 2018-09-18 戴姆勒股份公司 Stochastic-optimization-based accuracy quality evaluation method for point objects in a high-precision map
CN109493407B (zh) * 2018-11-19 2022-03-25 腾讯科技(深圳)有限公司 Method, apparatus and computer device for densifying a laser point cloud

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108268483A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Method for generating a grid map for navigation control of an unmanned vehicle
CN108268518A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Apparatus for generating a grid map for navigation control of an unmanned vehicle
CN108398705A (zh) * 2018-03-06 2018-08-14 广州小马智行科技有限公司 Map generation method and apparatus, and vehicle positioning method and apparatus
CN109064506A (zh) * 2018-07-04 2018-12-21 百度在线网络技术(北京)有限公司 High-precision map generation method and apparatus, and storage medium

Also Published As

Publication number Publication date
CN111854748B (zh) 2022-11-22
CN111854748A (zh) 2020-10-30

Similar Documents

Publication Publication Date Title
US20220138896A1 (en) Systems and methods for positioning
Vivacqua et al. A low cost sensors approach for accurate vehicle localization and autonomous driving application
US11965744B2 (en) Systems and methods for indoor positioning
CN103575267B (zh) 使图像与用于导航的地形高程地图相关的方法
US20220187843A1 (en) Systems and methods for calibrating an inertial measurement unit and a camera
US20210019535A1 (en) Systems and methods for pose determination
AU2018282435B1 (en) Vehicle positioning system using LiDAR
KR102013802B1 (ko) 드론을 이용한 하천 하상 지형 조사 시스템 및 그 구동방법
US20220171060A1 (en) Systems and methods for calibrating a camera and a multi-line lidar
CN111308415B (zh) 一种基于时间延迟的在线估计位姿的方法和设备
WO2020206774A1 (en) Systems and methods for positioning
WO2021212294A1 (en) Systems and methods for determining a two-dimensional map
CN112041210B (zh) 用于自动驾驶的系统和方法
US11940279B2 (en) Systems and methods for positioning
Matsuura et al. High-precision plant height measurement by drone with RTK-GNSS and single camera for real-time processing
CN112146627B (zh) 在无特征表面上使用投影图案的飞行器成像系统
WO2021077315A1 (en) Systems and methods for autonomous driving
WO2021012243A1 (en) Positioning systems and methods
US20220178701A1 (en) Systems and methods for positioning a target subject
WO2021212297A1 (en) Systems and methods for distance measurement
US20220270288A1 (en) Systems and methods for pose determination
WO2021051358A1 (en) Systems and methods for generating pose graph
Valerievich et al. Experimental assessment of the distance measurement accuracy using the active-pulse television measuring system and a digital terrain model
US20220187432A1 (en) Systems and methods for calibrating a camera and a lidar
Han et al. Mapping road surface features using single-camera images acquired by a mobile mapping system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19924119

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19924119

Country of ref document: EP

Kind code of ref document: A1