WO2021012243A1 - Positioning systems and methods - Google Patents


Info

Publication number
WO2021012243A1
Authority
WO
WIPO (PCT)
Prior art keywords
cell
characteristic value
elevation
subject
current iteration
Application number
PCT/CN2019/097609
Other languages
French (fr)
Inventor
Xiaozhi Qu
Baohua ZHU
Shengsheng HAN
Tingbo Hou
Original Assignee
Beijing Voyager Technology Co., Ltd.
Application filed by Beijing Voyager Technology Co., Ltd.
Priority to PCT/CN2019/097609 (WO2021012243A1)
Priority to CN201980045243.8A (CN112384756B)
Publication of WO2021012243A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875: Combinations of systems for determining attitude
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C5/00: Measuring height; Measuring distances transverse to line of sight; Levelling between separated points; Surveyors' levels
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Dead reckoning executed aboard the object being navigated
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656: Inertial navigation combined with passive imaging devices, e.g. cameras
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/28: Road-network navigation with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching

Definitions

  • The present disclosure generally relates to positioning systems and methods, and in particular, to systems and methods for determining an absolute elevation of a subject automatically, e.g., in an autonomous driving context.
  • Positioning technologies are widely used in various fields, such as navigation systems, e.g., navigation for autonomous driving systems.
  • To navigate, a subject (e.g., an autonomous vehicle) may be positioned in a coordinate system (e.g., the standard coordinate system of the Earth). The position of the subject may be represented by, for example, a longitude, a latitude, and an absolute elevation of the subject in the coordinate system.
  • The position of the subject may be determined based on point-cloud data acquired by one or more sensors (e.g., a LiDAR device) mounted on the subject.
  • However, the point-cloud data may include limited information that can be used to determine the absolute elevation of the subject in the coordinate system. Therefore, it is desirable to provide effective systems and methods for determining the absolute elevation of the subject in the coordinate system, thus improving positioning accuracy and efficiency.
  • a system may be provided.
  • the system may be configured for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment.
  • the system may include at least one storage medium including a set of instructions, and at least one processor in communication with the at least one storage medium.
  • the at least one processor may be configured to direct the system to perform the following operations. For each cell of a plurality of cells in the surrounding environment, the system may obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point.
  • the at least one feature value may be acquired by a sensor assembled on the subject.
  • the at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor.
  • the system may further determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor.
  • the at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system.
  • the system may obtain at least one first reference characteristic value of the cell based on a location information database.
  • the at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system.
  • the system may also determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
  • the updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  • the system may obtain the data points representative of the surrounding environment, and divide the data points representative of the surrounding environment into the plurality of sets of data points corresponding to the plurality of cells of the surrounding environment.
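The per-cell grouping described above amounts to binning the point cloud on a horizontal grid. A minimal sketch, assuming square cells of a fixed size and data points given as (x, y, relative elevation) tuples; neither the cell shape nor the cell size is fixed by the disclosure:

```python
from collections import defaultdict

def divide_into_cells(points, cell_size=1.0):
    """Group data points into square grid cells keyed by (col, row).

    Each point is (x, y, relative_elevation), with x and y horizontal
    coordinates in the sensor's reference frame; cell_size is an
    assumed parameter, not one specified in the disclosure.
    """
    cells = defaultdict(list)
    for x, y, rel_z in points:
        key = (int(x // cell_size), int(y // cell_size))
        cells[key].append((x, y, rel_z))
    return cells

points = [(0.2, 0.3, -1.5), (0.8, 0.1, -1.4), (1.2, 0.4, -1.6)]
cells = divide_into_cells(points)
# the first two points share cell (0, 0); the third falls in cell (1, 0)
```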
  • the at least one feature of the physical point further may include a secondary feature of the physical point.
  • the system may further determine at least one second characteristic value of the cell based on the feature values of the secondary feature of the corresponding physical points.
  • the at least one second characteristic value may represent the secondary feature of the corresponding physical points.
  • the system may also obtain at least one second reference characteristic value of the cell based on the location information database.
  • the at least one second reference characteristic value represents the secondary feature of the reference physical points in the cell.
  • the comparing the physical points with the reference physical points in each cell may be further based on the at least one second characteristic value of each cell and the at least one second reference characteristic value of each cell.
  • the secondary feature may include at least one of an intensity, a coordinate in a reference coordinate system associated with the sensor, a classification, or a scan direction.
  • the secondary feature may be an intensity.
  • the at least one second characteristic value of the cell may include a characteristic value indicating an overall intensity of the cell and a characteristic value indicating an intensity distribution of the cell.
  • the at least one first characteristic value may include a third characteristic value indicating an overall absolute elevation of the cell and a fourth characteristic value indicating an absolute elevation distribution of the cell.
  • the system may determine a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the corresponding physical points.
  • the preliminary third characteristic value may indicate an overall relative elevation of the cell.
  • the preliminary fourth characteristic value may indicate a relative elevation distribution of the cell.
  • the system may also determine the third characteristic value based on the preliminary third characteristic value and the hypothetic elevation, and designate the preliminary fourth characteristic value as the fourth characteristic value.
  • the preliminary third characteristic value of the cell may be a mean value of the relative elevations of the corresponding physical points.
  • the preliminary fourth characteristic value of the cell may be a covariance of the relative elevations of the corresponding physical points.
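For a single cell, these preliminary values reduce to the mean and variance of the relative elevations, with the mean shifted by the hypothetic elevation to obtain the absolute-elevation characteristic; a constant shift leaves the variance unchanged, which is why the preliminary fourth characteristic value can be designated as the fourth characteristic value directly. A sketch under those assumptions:

```python
def cell_characteristics(relative_elevations, hypothetic_elevation):
    """Return the (third, fourth) characteristic values of one cell.

    third  : overall absolute elevation = mean(relative) + hypothetic
    fourth : elevation distribution     = variance of the relative
             elevations (a constant shift leaves it unchanged)
    """
    n = len(relative_elevations)
    mean_rel = sum(relative_elevations) / n  # preliminary third characteristic value
    var_rel = sum((z - mean_rel) ** 2 for z in relative_elevations) / n  # preliminary fourth
    return mean_rel + hypothetic_elevation, var_rel

third, fourth = cell_characteristics([-1.5, -1.4, -1.6], 50.0)
# mean of the relative elevations is -1.5, so third = 48.5
```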
  • the determining the absolute elevation of the subject in the coordinate system may include one or more iterations.
  • each current iteration may include determining a similarity degree between the corresponding physical points and the corresponding reference physical points in the current iteration based on the at least one first characteristic value of the cell in the current iteration and the at least one first reference characteristic value of the cell.
  • each current iteration may also include updating the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration, and determining whether a termination condition is satisfied in the current iteration.
  • in response to a determination that the termination condition is satisfied, each current iteration may further include designating the updated hypothetic elevation in the current iteration as the absolute elevation.
  • in response to a determination that the termination condition is not satisfied, each current iteration may also include updating the at least one first characteristic value of each cell in the current iteration based on the updated hypothetic elevation in the current iteration, designating the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration, and designating the updated first characteristic value of each cell in the current iteration as the first characteristic value of the cell in the next iteration.
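The disclosure does not fix a formula for the similarity degree; one plausible choice, given that each cell is summarized by a mean and a variance, is a Gaussian likelihood of the observed cell mean under the reference mean with the two variances combined. This particular form is an assumption:

```python
import math

def cell_similarity(mean_obs, var_obs, mean_ref, var_ref):
    """Similarity degree between a cell's observed and reference
    elevation statistics: the likelihood of the observed mean under a
    normal distribution centred on the reference mean, with the two
    variances summed (an illustrative choice, not the disclosure's)."""
    var = var_obs + var_ref + 1e-9  # guard against zero variance
    diff = mean_obs - mean_ref
    return math.exp(-0.5 * diff * diff / var) / math.sqrt(2 * math.pi * var)

# cells whose statistics agree score higher than cells that disagree
s_close = cell_similarity(48.5, 0.01, 48.5, 0.01)
s_far = cell_similarity(48.5, 0.01, 49.5, 0.01)
```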
  • the determining the absolute elevation of the subject in the coordinate system may be based on a particle filtering technique.
  • the determining the absolute elevation of the subject in the coordinate system may further include assigning a plurality of particles to the plurality of cells, each of the plurality of particles having a state and being assigned to one or more cells of the plurality of cells.
  • the updating the hypothetic elevation in the current iteration may include determining a weight for each particle of the plurality of particles in the current iteration based on the similarity degrees of the plurality of cells in the current iteration, and determining the updated hypothetic elevation in the current iteration based on the weights and the states of the plurality of particles in the current iteration.
  • the updating the hypothetic elevation in the current iteration may further include updating the plurality of particles in the current iteration based on the weights of the plurality of particles in the current iteration, and designating the updated particles in the current iteration as the plurality of particles in the next iteration.
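The particle filtering scheme above can be sketched as follows: each particle's state is a candidate elevation, its weight comes from the combined cell similarities that elevation yields, the weighted mean of the states gives the updated hypothetic elevation, and resampling concentrates particles near well-matching states. The scoring function, the resampling method, and the jitter are illustrative assumptions:

```python
import random

def particle_filter_elevation(score, candidates, n_iter=10, jitter=0.05):
    """Estimate an absolute elevation with a one-dimensional particle filter.

    score(h)   : combined cell-similarity of hypothetic elevation h
                 (higher is better), supplied by the caller.
    candidates : initial particle states (candidate elevations).
    """
    particles = list(candidates)
    estimate = sum(particles) / len(particles)
    for _ in range(n_iter):
        weights = [score(h) for h in particles]
        total = sum(weights)
        if total == 0:
            break  # no particle matches; keep the last estimate
        weights = [w / total for w in weights]
        # weighted mean of the states = updated hypothetic elevation
        estimate = sum(w * h for w, h in zip(weights, particles))
        # resample in proportion to weight, with a small jitter
        particles = [random.gauss(random.choices(particles, weights)[0], jitter)
                     for _ in particles]
    return estimate

random.seed(0)
estimate = particle_filter_elevation(
    lambda h: max(0.0, 1.0 - abs(h - 48.5)),  # toy similarity peaked at 48.5
    candidates=[47.0 + 0.25 * i for i in range(13)])
```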
  • the system may obtain a reference set of reference data points corresponding to the cell from the location information database.
  • Each reference data point in the reference set may represent a reference physical point in the cell and include an absolute elevation of the reference physical point in the coordinate system.
  • the system may also determine the at least one first reference characteristic value of the cell based on the corresponding reference set.
  • the system may further obtain, based on the location information database, one or more reference absolute elevations of one or more reference objects according to an estimated location of the subject, and determine the hypothetic elevation of the subject in the coordinate system based on the one or more reference absolute elevations.
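One simple way to seed the hypothetic elevation from the retrieved reference objects is to average their absolute elevations; the disclosure only says the hypothetic elevation is determined based on the reference elevations, so the averaging is an assumption:

```python
def initial_hypothetic_elevation(reference_elevations):
    """Seed the hypothetic elevation of the subject from the absolute
    elevations of reference objects near its estimated location,
    here simply as their mean (an illustrative choice)."""
    return sum(reference_elevations) / len(reference_elevations)

h0 = initial_hypothetic_elevation([48.2, 48.9, 48.4])
```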
  • a method may be provided for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment.
  • the method may include the following operations. For each cell of a plurality of cells in the surrounding environment, the method may include obtaining a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point.
  • the at least one feature value may be acquired by a sensor assembled on the subject.
  • the at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor.
  • the method may also include determining at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor.
  • the at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system.
  • the method may further include obtaining at least one first reference characteristic value of the cell based on a location information database.
  • the at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system.
  • the method may further include determining the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
  • the updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  • a non-transitory computer readable medium embodying a computer program product may include instructions for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment.
  • the instructions may be configured to cause a computing device to perform the following operations. For each cell of a plurality of cells in the surrounding environment, the computing device may obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point.
  • the at least one feature value may be acquired by a sensor assembled on the subject.
  • the at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor.
  • the computing device may also determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor.
  • the at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system.
  • the computing device may further obtain at least one first reference characteristic value of the cell based on a location information database.
  • the at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system.
  • the computing device may also determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
  • the updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  • a system may be configured for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment.
  • the system may include an obtaining module, a characteristic determination module, and an absolute elevation determination module.
  • the obtaining module may be configured to, for each cell of a plurality of cells in the surrounding environment, obtain a set of data points representative of the cell.
  • Each data point in the set may represent a physical point in the cell and include at least one feature value of at least one feature of the physical point.
  • the at least one feature value may be acquired by a sensor assembled on the subject.
  • the at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor.
  • the characteristic determination module may be configured to, for each cell, determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor.
  • the at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system.
  • the obtaining module may be further configured to, for each cell, obtain at least one first reference characteristic value of the cell based on a location information database.
  • the at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell.
  • the absolute elevation determination module may be configured to determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
  • the updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  • FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system according to some embodiments of the present disclosure.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure.
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure.
  • FIG. 4 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • FIG. 5 is a flowchart illustrating an exemplary process for determining an absolute elevation of a subject in a coordinate system according to some embodiments of the present disclosure.
  • FIG. 6 is a flowchart illustrating an exemplary process for obtaining a plurality of sets of data points corresponding to a plurality of cells in a surrounding environment of a subject according to some embodiments of the present disclosure.
  • FIG. 7 is a flowchart illustrating an exemplary process for determining one or more first characteristic values of a cell according to some embodiments of the present disclosure.
  • FIG. 8 is a schematic diagram illustrating an exemplary process for determining an absolute elevation of a subject according to some embodiments of the present disclosure.
  • The flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in order; conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
  • the systems and methods disclosed in the present disclosure are described primarily regarding determining an absolute elevation of a subject (e.g., an autonomous vehicle) in an autonomous driving system. It should be understood that this is only one exemplary embodiment.
  • the systems and methods of the present disclosure may be applied to any other kind of transportation system.
  • the systems and methods of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof.
  • the vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, or the like, or any combination thereof.
  • An aspect of the present disclosure relates to systems and methods for determining an absolute elevation of a subject in a coordinate system (e.g., a standard coordinate system for the Earth).
  • the absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below a fixed reference point, line, or plane defined by the coordinate system.
  • the systems and methods may obtain a plurality of sets of data points, each of which represents a cell in a surrounding environment of the subject.
  • the sets of data points may be acquired by a sensor (e.g., a LiDAR device) mounted on the subject.
  • Each data point in a set representative of a cell may be associated with a physical point in the cell and include one or more feature values of one or more features of the physical point.
  • the feature(s) of a physical point may include a relative elevation of the physical point with respect to the sensor.
  • the systems and methods may determine one or more first characteristic values representing absolute elevations of the physical points in the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the physical points in the cell.
  • the systems and methods may also obtain or determine one or more first reference characteristic values representing absolute elevations of a plurality of reference physical points in the cell based on a location information database (e.g., a pre-built high-definition (HD) map covering the surrounding environment).
  • the systems and methods may further determine the absolute elevation of the subject by updating the hypothetic elevation of the subject.
  • the update of the hypothetic elevation may include a comparison between the physical points and the reference physical points in each cell according to the first characteristic value(s) and the first reference characteristic value(s) of each cell.
  • the absolute elevation of the subject may be determined based on a comparison of the physical points in the surrounding environment detected by the sensor and the corresponding reference physical points stored in the location information database.
  • the comparison may be performed for each cell in a parallel manner instead of for the whole surrounding environment, which may improve computational efficiency and reduce processing time.
  • the physical points in a cell may be compared with the corresponding reference physical points according to one or more characteristic values and one or more reference characteristic values representing one or more features of the cell. This may be more efficient compared with comparing the physical points with the reference physical points in the cell directly according to feature values of the physical points and the corresponding reference physical points.
  • FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system 100 according to some embodiments of the present disclosure.
  • the autonomous driving system 100 may include one or more vehicles 110 (vehicles 110-1, 110-2, ..., 110-n), a server 120, a terminal device 130, a storage device 140, a network 150, and a navigation system 160 (also referred to as a positioning system).
  • a vehicle 110 may carry a passenger to travel to a destination.
  • the vehicle 110 may be an autonomous vehicle.
  • the autonomous vehicle may refer to a vehicle that is capable of achieving a certain level of driving automation.
  • Exemplary levels of driving automation may include a first level at which the vehicle is mainly supervised by a human and has a specific autonomous function (e.g., autonomous steering or accelerating), a second level at which the vehicle has one or more advanced driver assistance systems (ADAS) (e.g., an adaptive cruise control system, a lane-keeping system) that can control the braking, steering, and/or acceleration of the vehicle, a third level at which the vehicle is able to drive autonomously when one or more certain conditions are met, a fourth level at which the vehicle can operate without human input or oversight but still is subject to some constraints (e.g., being confined to a certain area), a fifth level at which the vehicle can operate autonomously under all circumstances, or the like, or any combination thereof.
  • the vehicle 110 may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, a conventional internal combustion engine vehicle, or any other type of vehicle.
  • the vehicle 110 may be a sports vehicle, a coupe, a sedan, a pick-up truck, a station wagon, a sports utility vehicle (SUV), a minivan, a conversion van, or have any other style.
  • the vehicle 110 may include one or more components similar to those of a conventional vehicle, for example, a chassis, a suspension, a steering device (e.g., a steering wheel), a brake device (e.g., a brake pedal), an accelerator, etc.
  • the vehicle 110 may have a body and at least one wheel, e.g., a pair of front wheels and a pair of rear wheels.
  • the vehicle 110 may be all-wheel drive (AWD), front wheel drive (FWD), or rear wheel drive (RWD).
  • the vehicle 110 may be operated by an operator occupying the vehicle, under a remote control, and/or autonomously.
  • the vehicle 110 may be a survey vehicle configured to acquire data for constructing an HD map or three-dimensional (3D) city model.
  • the vehicle 110 may be equipped with one or more sensors 112 such that the vehicle 110 is capable of sensing its surrounding environment.
  • the sensor (s) 112 may be mounted on the vehicle 110 using any suitable mounting mechanism.
  • the mounting mechanism may be an electro-mechanical device installed or otherwise attached to the body of the vehicle 110.
  • the mounting mechanism may use one or more screws, adhesives, or other fastening means.
  • the sensor (s) 112 may be mounted on any position of the vehicle 110, for example, inside or outside the body of the vehicle.
  • the sensor (s) 112 of the vehicle 110 may include any sensor that is capable of collecting information related to the surrounding environment of the vehicle 110.
  • the sensor (s) 112 may include a light detection and ranging (LiDAR) device, a camera, a GPS device, an inertial measurement unit (IMU) sensor, a radar, a sonar, or the like, or any combination thereof.
  • the LiDAR device may be configured to scan the surrounding environment and acquire point-cloud data representative of the surrounding environment.
  • the LiDAR device may measure a distance to an object by illuminating the object with light pulses and measuring the reflected pulses. Differences in light return times and wavelengths may then be used to construct a 3D representation of the object.
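The time-of-flight relation described above can be sketched as follows. This is an illustrative snippet, not the device's actual firmware, and the function name is hypothetical.

```python
# Illustrative time-of-flight ranging: distance is half the round-trip
# travel time of a light pulse multiplied by the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_return_time(round_trip_seconds: float) -> float:
    """Distance (in meters) to the object that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 1 microsecond implies an object about 150 m away.
print(round(distance_from_return_time(1e-6), 1))
```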
  • the light pulses used by the LiDAR device may be ultraviolet, visible, near infrared, etc.
  • the camera may be configured to obtain one or more images relating to objects (e.g., a person, an animal, a tree, a roadblock, a building, or a vehicle) that are within the scope of the camera.
  • the GPS device may be configured to receive geo-location and time information from GPS satellites and then determine the device's geographical position.
  • the IMU sensor may be configured to measure and provide the vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using one or more inertial sensors, such as an accelerometer and a gyroscope.
  • the server 120 may be a single server or a server group.
  • the server group may be centralized or distributed (e.g., the server 120 may be a distributed system) .
  • the server 120 may be local or remote.
  • the server 120 may access information and/or data stored in the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160 via the network 150.
  • the server 120 may be directly connected to the terminal device 130, the sensor (s) 112, the vehicle 110, and/or the storage device 140 to access stored information and/or data.
  • the server 120 may be implemented on a cloud platform or an onboard computer.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the server 120 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
  • the server 120 may include a processing device 122.
  • the processing device 122 may process information and/or data associated with the autonomous driving system 100 to perform one or more functions described in the present disclosure. For example, the processing device 122 may determine a position (e.g., an absolute elevation) of the vehicle 110 in a coordinate system (e.g., a world geodetic system) according to data associated with a surrounding environment collected by the sensor (s) 112. Particularly, in certain embodiments, the sensor (s) 112 may continuously or intermittently (e.g., periodically or irregularly) collect data associated with the surrounding environment when the vehicle 110 moves. The processing device 122 may determine the position of the vehicle 110 in the coordinate system in real-time or intermittently (e.g., periodically or irregularly) .
  • the processing device 122 may include one or more processing devices (e.g., single-core processing device (s) or multi-core processor (s) ) .
  • the processing device 122 may include a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a digital signal processor (DSP) , a field programmable gate array (FPGA) , a programmable logic device (PLD) , a controller, a microcontroller unit, a reduced instruction-set computer (RISC) , a microprocessor, or the like, or any combination thereof.
  • the server 120 may be connected to the network 150 to communicate with one or more components (e.g., the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160) of the autonomous driving system 100.
  • the server 120 may be directly connected to or communicate with one or more components (e.g., the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160) of the autonomous driving system 100.
  • the server 120 may be integrated into the vehicle 110.
  • the server 120 may be a computing device (e.g., a computer) installed in the vehicle 110.
  • the terminal device 130 may enable a user interaction between a user (e.g., a driver of the vehicle 110) and one or more components of the autonomous driving system 100.
  • the terminal device 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a vehicle 130-4, a smart watch 130-5, or the like, or any combination thereof.
  • the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google TM Glass, an Oculus Rift, a HoloLens, a Gear VR, etc.
  • the built-in device in the vehicle 130-4 may include an onboard computer, an onboard television, etc.
  • the server 120 may be integrated into the terminal device 130.
  • the terminal device 130 may be integrated into the vehicle 110.
  • the storage device 140 may store data and/or instructions.
  • the storage device 140 may store data obtained from the terminal device 130, the sensor (s) 112, the vehicle 110, the navigation system 160, the processing device 122, and/or an external storage device.
  • the storage device 140 may store an HD map of an area (e.g., a country, a city, a street) and/or one or more characteristic values of the area (e.g., a mean absolute elevation of physical points in the area) .
  • the storage device 140 may store data and/or instructions that the server 120 may execute or use to perform exemplary methods described in the present disclosure.
  • the storage device 140 may store instructions that the processing device 122 may execute or use to determine an absolute elevation of the vehicle 110 in a coordinate system (e.g., a world geodetic system) .
  • the storage device 140 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary mass storage devices may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • Exemplary removable storage devices may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically-erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage device 140 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 140 may be connected to the network 150 to communicate with one or more components (e.g., the server 120, the terminal device 130, the sensor (s) 112, the vehicle 110, and/or the navigation system 160) of the autonomous driving system 100.
  • One or more components of the autonomous driving system 100 may access the data or instructions stored in the storage device 140 via the network 150.
  • the storage device 140 may be directly connected to or communicate with one or more components (e.g., the server 120, the terminal device 130, the sensor (s) 112, the vehicle 110, and/or the navigation system 160) of the autonomous driving system 100.
  • the storage device 140 may be part of the server 120.
  • the storage device 140 may be integrated into the vehicle 110.
  • the network 150 may facilitate the exchange of information and/or data.
  • the server 120 may obtain data related to a surrounding environment of the vehicle 110 from the sensor (s) 112 mounted on the vehicle 110 via the network 150.
  • the network 150 may be any type of wired or wireless network, or combination thereof.
  • the network 150 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , a metropolitan area network (MAN) , a public telephone switched network (PSTN) , a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 150 may include one or more network access points.
  • the network 150 may include wired or wireless network access points (e.g., 150-1, 150-2) , through which one or more components of the autonomous driving system 100 may be connected to the network 150 to exchange data and/or information.
  • the navigation system 160 may determine information associated with an object, for example, one or more terminal devices 130, the vehicle 110, etc.
  • the navigation system 160 may be a global positioning system (GPS) , a global navigation satellite system (GLONASS) , a compass navigation system (COMPASS) , a BeiDou navigation satellite system, a Galileo positioning system, a quasi-zenith satellite system (QZSS) , etc.
  • the information may include a location, an elevation, a velocity, or an acceleration of the object, or a current time.
  • the navigation system 160 may include one or more satellites, for example, a satellite 160-1, a satellite 160-2, and a satellite 160-3.
  • the satellites 160-1 through 160-3 may determine the information mentioned above independently or jointly.
  • the navigation system 160 may send the information mentioned above to the network 150, the terminal device 130, or the vehicle 110 via wireless connections.
  • the autonomous driving system 100 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
  • the autonomous driving system 100 may further include one or more additional components, such as an information source, a location information database (which may be an independent part of the autonomous driving system 100 or integrated into the storage device 140) .
  • one or more components of the autonomous driving system 100 may be omitted or be replaced by one or more other devices that can realize similar functions.
  • a GPS device of a vehicle 110 may be replaced by another positioning device, such as BeiDou.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and software components of a computing device 200 according to some embodiments of the present disclosure.
  • the computing device 200 may be used to implement any component of the autonomous driving system 100 as described herein.
  • the terminal device 130 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof.
  • the computer functions relating to the autonomous driving system 100 as described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
  • the computing device 200 may include a communication bus 210, a processor 220, a storage device, an input/output (I/O) 260, and a communication port 250.
  • the processor 220 may execute computer instructions (e.g., program code) and perform functions of one or more components of the autonomous driving system 100 in accordance with techniques described herein. For example, the processor 220 may determine an absolute elevation of the vehicle 110 in a coordinate system (e.g., a world geodetic system) .
  • the computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions described herein.
  • the processor 220 may include interface circuits and processing circuits therein.
  • the interface circuits may be configured to receive electronic signals from the communication bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process.
  • the processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the communication bus 210.
  • the processor 220 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC) , an application specific integrated circuits (ASICs) , an application-specific instruction-set processor (ASIP) , a central processing unit (CPU) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a microcontroller unit, a digital signal processor (DSP) , a field programmable gate array (FPGA) , an advanced RISC machine (ARM) , a programmable logic device (PLD) , any circuit or processor capable of executing one or more functions, or the like, or any combinations thereof.
  • merely for illustration, only one processor 220 is described in the computing device 200.
  • the computing device 200 in the present disclosure may also include multiple processors, thus operations and/or method steps that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • if the processor of the computing device 200 executes both step A and step B, it should be understood that step A and step B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes step A and a second processor executes step B, or the first and second processors jointly execute steps A and B) .
  • the storage device may store data/information related to the autonomous driving system 100.
  • the storage device may include a mass storage device, a removable storage device, a volatile read-and-write memory, a random access memory (RAM) 240, a read-only memory (ROM) 230, a disk 270, or the like, or any combination thereof.
  • the storage device may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
  • the storage device may store a program for the processor 220 to execute.
  • the I/O 260 may input and/or output signals, data, information, etc. In some embodiments, the I/O 260 may enable a user interaction with the computing device 200. In some embodiments, the I/O 260 may include an input device and an output device. Examples of the input device may include a keyboard, a mouse, a touch screen, a microphone, or the like, or a combination thereof. Examples of the output device may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof.
  • Examples of the display device may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , a touch screen, or the like, or a combination thereof.
  • the communication port 250 may be connected to a network (e.g., the network 150) to facilitate data communications.
  • the communication port 250 may establish connections between the computing device 200 and one or more components of the autonomous driving system 100.
  • the connection may be a wired connection, a wireless connection, any other communication connection that can enable data transmission and/or reception, and/or any combination of these connections.
  • the wired connection may include, for example, an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
  • the wireless connection may include, for example, a Bluetooth TM link, a Wi-Fi TM link, a WiMax TM link, a WLAN link, a ZigBee link, a mobile network link (e.g., 3G, 4G, 5G) , or the like, or any combination thereof.
  • the communication port 250 may be and/or include a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 250 may be a specially designed communication port.
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device 300 according to some embodiments of the present disclosure.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • a mobile operating system 370 (e.g., iOS TM , Android TM , Windows Phone TM ) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to positioning or other information from the processing device 122.
  • User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 122 and/or other components of the autonomous driving system 100 via the network 150.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIG. 4 is a block diagram illustrating an exemplary processing device 122 according to some embodiments of the present disclosure.
  • the processing device 122 may include an obtaining module 410, a characteristic determination module 420, and an absolute elevation determination module 430.
  • the obtaining module 410 may obtain information related to the autonomous driving system 100.
  • the obtaining module may obtain point-cloud data representative of a surrounding environment of a subject, a plurality of sets of data points representing a plurality of cells in the surrounding environment, one or more reference characteristic values of the cells, an estimated location of the subject, or the like, or any combination thereof. Details regarding the information obtained by the obtaining module 410 may be found elsewhere in the present disclosure (e.g., FIG. 5 and the relevant descriptions thereof) .
  • the characteristic determination module 420 may determine one or more characteristic values and/or one or more reference characteristic values of a cell.
  • the characteristic value (s) of the cell may include, for example, one or more first characteristic values representing absolute elevations of a plurality of physical points in the cell, one or more second characteristic values representing a secondary feature of the physical points in the cell, or the like, or any combination thereof.
  • the reference characteristic value (s) of the cell may include, for example, one or more first reference characteristic values representing absolute elevations of a plurality of reference physical points in the cell, one or more second reference characteristic values representing a secondary feature of the reference physical points in the cell, or the like, or any combination thereof. Details regarding the characteristic value (s) and/or the reference characteristic value (s) may be found elsewhere in the present disclosure (e.g., FIG. 5 and the relevant descriptions thereof) .
  • the absolute elevation determination module 430 may be configured to determine an absolute elevation of the subject in a coordinate system. In some embodiments, the absolute elevation determination module 430 may determine the absolute elevation by updating a hypothetic elevation of the subject in the coordinate system. Details regarding the determination of the absolute elevation may be found elsewhere in the present disclosure (e.g., operation 540 and FIG. 7 and the relevant descriptions thereof) .
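As a rough illustration only (the disclosure's actual update rule is the subject of operation 540 and FIG. 7), updating a hypothetic elevation can be pictured as a coarse-to-fine one-dimensional search that minimizes the mismatch between measured cell elevations and reference characteristic values; every name and the search strategy below are assumptions, not the claimed algorithm.

```python
# Hypothetical sketch of updating a hypothetic elevation: search along the
# vertical axis for the value that best reconciles the measured relative
# elevations of the cells with the reference elevations from the map.
def mismatch(hypothesis, relative_means, reference_means):
    # Squared error between (hypothesis + relative elevation) and the
    # reference absolute elevation, summed over cells.
    return sum((hypothesis + r - m) ** 2
               for r, m in zip(relative_means, reference_means))

def update_hypothetic_elevation(hypothesis, relative_means, reference_means,
                                iterations=30, step=0.5):
    for _ in range(iterations):  # each pass plays the role of one iteration
        candidates = (hypothesis - step, hypothesis, hypothesis + step)
        best = min(candidates,
                   key=lambda h: mismatch(h, relative_means, reference_means))
        if best == hypothesis:
            step /= 2.0  # no improvement: refine the search step
        hypothesis = best
    return hypothesis

# Relative cell elevations of -1.8 m and -2.0 m, with map references of
# 48.2 m and 48.0 m, are consistent with a subject elevation of 50.0 m.
print(round(update_hypothetic_elevation(47.0, [-1.8, -2.0], [48.2, 48.0]), 3))
```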
  • the modules in the processing device 122 may be connected to or communicate with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof.
  • Two or more of the modules may be combined into a single module, and any one of the modules may be divided into two or more units. In some embodiments, one or more modules described above may be omitted. Additionally or alternatively, the processing device 122 may include one or more additional modules.
  • the processing device 122 may further include a storage module (not shown in FIG. 4) .
  • one or more modules may be combined into a single module.
  • the characteristic determination module 420 and the absolute elevation determination module 430 may be combined into a single module.
  • FIG. 5 is a flowchart illustrating an exemplary process for determining an absolute elevation of a subject in a coordinate system according to some embodiments of the present disclosure.
  • the process 500 may be executed by the autonomous driving system 100.
  • the process 500 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240.
  • the processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 122 may be configured to perform the process 500.
  • the subject may refer to any composition of organic and/or inorganic matter, with or without life, that is located on earth.
  • the subject may be any vehicle (e.g., car, boat, or aircraft) or any person.
  • the subject may be an autonomous vehicle (e.g., the vehicle 110) as described elsewhere in the present disclosure (e.g., FIG. 1 and the relevant descriptions) .
  • the absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below a fixed reference point, line, or plane defined by the coordinate system.
  • the coordinate system may be a standard coordinate system for the Earth that defines a sea level or a geoid of the earth.
  • the absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below the sea level or the geoid of the earth.
  • the subject may have a plurality of physical points on or within the subject.
  • the absolute elevation of the subject may be an absolute elevation of any physical point of the subject.
  • the absolute elevation of the subject may be a statistic value (e.g., a mean value) of the absolute elevations of all or a portion of the physical points of the subject.
  • the subject may be an autonomous vehicle having a LiDAR device.
  • the absolute elevation of the autonomous vehicle may be an absolute elevation of the LiDAR device or a mean absolute elevation of the physical points on the autonomous vehicle.
  • the processing device 122 (e.g., the obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain a plurality of sets of data points, each of which represents a cell in a surrounding environment of the subject.
  • the surrounding environment of the subject may refer to the circumstances and one or more objects (including living and non-living objects) surrounding the subject.
  • the surrounding environment may cover an area having any size and shape.
  • the area covered by the surrounding environment may be associated with the performance of the sensor assembled on the subject.
  • a surrounding environment of the autonomous vehicle may include one or more objects around the autonomous vehicle, such as a ground surface, a lane marking, a building, a pedestrian, an animal, a plant, one or more other vehicles, or the like.
  • the size of an area covered by the surrounding environment of the autonomous vehicle may depend (or partially depend) on a scanning range of a LiDAR device assembled on the autonomous vehicle.
  • a cell of the surrounding environment may correspond to a subspace of the surrounding environment and include a plurality of physical points in the corresponding subspace, such as physical points on the body surface of an object located in the corresponding subspace.
  • the cell is a virtual concept.
  • the set of data points representative of the cell may include a plurality of data points, each of which represents a physical point in the cell (i.e., in the corresponding subspace) and includes at least one feature value of at least one feature of the physical point.
  • Exemplary features of a physical point may include a relative position of the physical point with respect to the sensor (or the subject) , an intensity of the physical point, a classification of the physical point, a scan direction associated with the physical point, or the like, or any combination thereof.
  • the relative position of the physical point with respect to the sensor (or the subject) may be denoted as a coordinate of the physical point in a reference coordinate system associated with the sensor (or the subject) .
  • the reference coordinate system associated with the sensor (or the subject) may be any suitable coordinate system whose origin is located at the sensor (or the subject) .
  • the reference coordinate system may be a 3-dimensional coordinate system having an X-axis, a Y-axis, and a Z-axis.
  • the Y-axis may be parallel with an orientation of the subject.
  • the X-axis may be perpendicular to the Y-axis and form an X-Y plane that is parallel or roughly parallel with a ground surface or a bottom surface of the subject.
  • the Z-axis may be perpendicular to the X-Y plane.
  • the coordinate of the physical point in the reference coordinate system may include one or more of an X-coordinate on the X-axis, a Y-coordinate on the Y-axis, and a Z-coordinate on the Z-axis, wherein the Z-coordinate may also be referred to as the relative elevation of the physical point with respect to the sensor.
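To make the relative-elevation convention concrete: under a hypothesized absolute elevation of the sensor, a physical point's absolute elevation is simply the sensor elevation plus the point's Z-coordinate. The sketch below is a trivial illustration with hypothetical names, not part of the disclosure.

```python
# Hypothetical sketch: a point's absolute elevation under a hypothesized
# sensor elevation is the sensor elevation plus the point's Z-coordinate
# (its relative elevation with respect to the sensor).
def point_absolute_elevation(sensor_elevation: float, z: float) -> float:
    return sensor_elevation + z

# A ground point 1.8 m below a sensor hypothesized at 50.0 m sits near 48.2 m.
print(point_absolute_elevation(50.0, -1.8))
```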
  • the intensity of the physical point may refer to a strength of returned laser pulse (s) that are reflected by the physical point.
  • the intensity of the physical point may be associated with a property (e.g., the composition and/or material) of the physical point.
  • the classification of the physical point may refer to a type of an object to which the physical point belongs.
  • the scan direction associated with the physical point may refer to the direction in which a scanning mirror of the sensor was pointed when the corresponding data point was detected by the sensor.
  • the at least one feature of a physical point recorded by a corresponding data point may be the relative elevation of the physical point with respect to the sensor.
  • the at least one feature of the physical point may include the relative elevation and one or more secondary features of the physical point.
  • a secondary feature may refer to any feature other than the relative elevation.
  • the secondary feature may be the intensity, the X-coordinate in the reference coordinate system, the Y-coordinate in the reference coordinate system, the classification, the scan direction, or any other feature that can be measured by the sensor.
  • the sets of data points representative of the cells may be acquired by a sensor (e.g., the sensor (s) 112) assembled on the subject.
  • the sensor may include one or more LiDAR devices as described in connection with FIG. 1.
  • the sensor may be configured to emit laser pulses to scan the surrounding environment to collect data points representative of physical points in the surrounding environment (also referred to as point-cloud data representative of the surrounding environment herein) .
  • the processing device 122 may obtain the point-cloud data from the sensor or another source that stores the point-cloud data.
  • the processing device 122 may further divide the point-cloud data into the sets of data points corresponding to the cells. Details regarding the obtaining or determination of the sets of data points may be found elsewhere in the present disclosure (e.g., FIG. 6 and the relevant descriptions thereof) .
  • the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may determine one or more first characteristic values and one or more second characteristic values of the cell.
  • the first characteristic value (s) of a cell may represent absolute elevations of the corresponding physical points.
  • the second characteristic value (s) of a cell may represent a secondary feature of the corresponding physical points.
  • the secondary feature may be or include the intensity, the X-coordinate and/or Y-coordinate in the reference coordinate system, the classification, the scan direction, or the like, or any combination thereof.
  • the intensity is described as an example of the secondary feature.
  • the following descriptions are provided with reference to the determination of one or more first characteristic values representing the absolute elevation and one or more second characteristic values representing the intensity for a cell. It should be understood that this is only one exemplary embodiment.
  • the secondary feature may include any feature (s) other than the intensity.
  • the first characteristic value (s) representing the absolute elevation may include a third characteristic value indicating an overall absolute elevation of the cell (i.e., an overall level of the absolute elevations of the corresponding physical points) .
  • the first characteristic value (s) of the cell may include a fourth characteristic value indicating an absolute elevation distribution of the cell (i.e., a distribution of the absolute elevations of the corresponding physical points) .
  • the third characteristic value may be a mean absolute elevation, a median absolute elevation, or any other parameter that can reflect the overall absolute elevation of the cell.
  • the fourth characteristic value may be a covariance, a variance, a standard deviation of the absolute elevations of the physical points in the cell, or any other parameter that can reflect the absolute elevation distribution of the cell.
  • the second characteristic value (s) representing the intensity may include a fifth characteristic value indicating an overall intensity of the cell (i.e., an overall level of the intensities of the corresponding physical points) .
  • the second characteristic value (s) representing the intensity may include a sixth characteristic value indicating an intensity distribution of the cell (i.e., a distribution of the intensities of the corresponding physical points) .
  • the fifth characteristic value may be a mean intensity, a median intensity, or any other parameter that can reflect the overall intensity of the cell.
  • the sixth characteristic value may be a covariance, a variance, a standard deviation of the intensities of the physical points in the cell, or any other parameter that can reflect the intensity distribution of the cell.
  • the first characteristic value (s) of the cell may be determined based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points.
  • the hypothetic elevation of the subject may refer to a hypothetic value of the absolute elevation of the subject in the coordinate system.
  • the processing device 122 may determine one or more characteristic values representing the relative elevations of the physical points in the cell. The processing device 122 may further determine the first characteristic value (s) of the cell based on the hypothetic elevation and the characteristic value (s) representing the relative elevations of the physical points in the cell.
  • the second characteristic value (s) of the cell may be determined based on the feature values of the secondary feature of the physical points in the cell.
  • the processing device 122 may utilize one or more vectors or matrixes to represent the first and the second characteristic value (s) of the cell.
  • a feature vector may be constructed based on the third characteristic value and the fifth characteristic value to represent the overall absolute elevation and the overall intensity of the cell.
  • a covariance matrix may be determined to represent the absolute elevation distribution and the intensity distribution of the cell. Details regarding the hypothetic elevation and/or the determination of the first and the second characteristic value (s) may be found elsewhere in the present disclosure (e.g., FIG. 7 and the relevant descriptions thereof) .
  • the processing engine 122 (e.g., the obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain one or more first reference characteristic values and one or more second reference characteristic values of the cell based on a location information database.
  • the first reference characteristic value (s) of a cell may represent absolute elevations of a plurality of reference physical points in the cell.
  • the first reference characteristic value (s) of the cell may include a third reference characteristic value indicating an overall level of the absolute elevation of the reference physical points in the cell (also referred to as a reference overall absolute elevation of the cell) , such as a mean or median absolute elevation of the corresponding reference physical points.
  • the first reference characteristic value (s) may include a fourth reference characteristic value indicating a distribution of the absolute elevations of the reference physical points in the cell (also referred to as a reference absolute elevation distribution of the cell) , such as a covariance, a variance, or a standard deviation of the absolute elevations of the corresponding reference physical points.
  • the second reference characteristic value (s) may represent the secondary feature of the reference physical points in the cell. Taking the intensity as an exemplary secondary feature, the second reference characteristic value (s) of a cell may include a fifth reference characteristic value indicating an overall level of the intensities of the reference physical points in the cell (also referred to as a reference overall intensity of the cell) , such as a mean or median intensity of the corresponding reference physical points in the cell.
  • the second reference characteristic value (s) of a cell may include a sixth reference characteristic value indicating a distribution of the intensities of the reference physical points in the cell (also referred to as a reference intensity distribution of the cell) , such as a covariance, a variance, or a standard deviation of the intensities of the corresponding reference physical points.
  • the location information database may refer to a database that stores location information of a region (e.g., a country or city) covering the surrounding environment of the subject.
  • the location information database may be a local database in the autonomous driving system 100, for example, be a portion of the storage device 140.
  • the location information database may be a remote database, such as a cloud database, which can be accessed by the processing device 122 via the network 150.
  • the location information database may store the first and/or second reference characteristic values of the cells, which may be previously determined and stored in the location information database.
  • the processing device 122 may directly obtain the first and/or second reference characteristic values of the cells from the location information database.
  • the location information database may include a plurality of first reference characteristic values of a plurality of reference cells in the region where the subject is located.
  • the reference cell of the region may be similar to a cell in the surrounding environment.
  • the processing device 122 may determine a reference cell matching the cell (e.g., a reference cell having the same or similar position as the cell in the coordinate system) , and designate the first reference characteristic value of the matched reference cell as the first reference characteristic value of the cell.
  • the location information database may store reference point-cloud data representative of the region.
  • the processing device 122 may determine the first and/or second reference characteristic values of the cells based on the reference point-cloud data representative of the region.
  • the reference point-cloud data may be stored as an HD map of the region.
  • the reference point-cloud data may include a plurality of reference data points representative of a plurality of reference physical points in the region.
  • a reference physical point may refer to a physical point in the region which is detected by a sensor other than the sensor of the subject.
  • a reference data point of a reference physical point may encode one or more feature values of one or more features of the reference physical point.
  • Exemplary features of the reference physical point may include but are not limited to a position of the reference physical point in the coordinate system (e.g., a standard coordinate system for the Earth) , a relative elevation of the reference physical point with respect to a sensor that detects the reference physical point, one or more secondary features (e.g., an intensity, a classification, and/or a scan direction) of the reference physical point, or the like, or any combination thereof.
  • the position of the reference physical point in the coordinate system may include, for example, a longitude, a latitude, and/or an absolute elevation of the reference physical point in the coordinate system.
  • At least a portion of the reference point-cloud data may be previously acquired by a sensor mounted on a sample subject, for example, a survey vehicle (e.g., a vehicle 110) equipped with one or more sensors with high accuracy (e.g., a LiDAR device) .
  • the reference physical points may be inputted by a user or be determined based on the information acquired by the survey vehicle.
  • the absolute elevations of the reference physical points may be determined based on the information acquired by the survey vehicle and stored in the location information database.
  • the absolute elevations of the reference physical points may be confirmed and/or modified by a user before the storing.
  • the processing device 122 may obtain a reference set of reference data points corresponding to the cell from the location information database.
  • Each reference data point in the reference set may represent a reference physical point in the cell and include an absolute elevation of the reference physical point in the coordinate system.
  • the processing device 122 may obtain the reference point-cloud data of the region where the subject is located from the location information database, and further determine the reference sets of the cells based on the reference point-cloud data. For example, the processing device 122 may determine a group of reference physical points in the cell based on the positions of the reference physical points in the coordinate system, wherein the reference data points representative of the group of reference physical points may be regarded as the reference set of the cell.
  • the reference point-cloud data of the region in the location information database may be previously divided into a plurality of reference sets corresponding to the reference cells in the region, for example, in a manner similar to the division of the point-cloud data of the surrounding environment as described in connection with operation 620.
  • the processing device 122 may determine a reference cell matching the cell (e.g., a reference cell having the same or similar position as the cell in the coordinate system) , and designate the reference set of the matched reference cell as the reference set of the cell.
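The matched-reference-cell lookup described above can be sketched minimally as follows. This is an illustrative assumption, not the disclosure's prescribed data layout: reference sets are keyed here by the same 2D grid indices used for the cells, the sample values are hypothetical, and the `tolerance` fallback is one possible reading of a "similar position" match.

```python
# Hypothetical reference sets keyed by a (col, row) grid index in the shared
# coordinate system; each entry lists (absolute elevation, intensity) pairs.
reference_sets = {
    (1024, 2048): [(12.31, 41.0), (12.35, 39.5)],
    (1024, 2049): [(12.40, 38.0)],
}

def reference_set_for(cell_index, tolerance=1):
    """Return the reference set of the reference cell matching cell_index.

    Falls back to a neighboring reference cell within `tolerance` grid steps
    when no exact match exists (a "similar position" match).
    """
    if cell_index in reference_sets:
        return reference_sets[cell_index]
    cx, cy = cell_index
    for dx in range(-tolerance, tolerance + 1):
        for dy in range(-tolerance, tolerance + 1):
            if (cx + dx, cy + dy) in reference_sets:
                return reference_sets[(cx + dx, cy + dy)]
    return None

# Exact match, then a similar-position match one grid step away.
exact = reference_set_for((1024, 2048))
near = reference_set_for((1025, 2048))
```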
  • the processing engine 122 may determine the first reference characteristic value (s) of each cell based on the absolute elevations of the reference physical points in the cell.
  • the processing device 122 may further determine the second reference characteristic value (s) of the cell based on feature values of the secondary feature of the reference physical points in the cell.
  • the first and the second reference characteristic value (s) may be denoted in a similar form as the first and the second characteristic value (s) as described in connection with operation 520, for example, be denoted as one or more vectors or matrixes.
  • the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
  • the updating of the hypothetic elevation may include comparing the physical points with the reference physical points in each cell based on the first characteristic value (s) , the second characteristic value (s) , the first reference characteristic value (s) , and the second reference characteristic value (s) of each cell. For example, for each cell, the processing device 122 may determine a similarity degree between the corresponding physical points and the corresponding reference physical points based on the first, second, first reference, and second reference characteristic values of each cell. The processing device 122 may further update the hypothetic elevation based on the similarity degrees of the cells. In certain embodiments, the processing device 122 may determine the absolute elevation by performing one or more iterations as described in connection with FIG. 8.
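One way to realize the similarity-based update above is a coarse one-dimensional search over candidate elevations, scoring each candidate with a per-cell Gaussian similarity degree. The disclosure does not fix a particular similarity measure or search strategy, so the Gaussian form, the grid search, and all names and numbers below are illustrative assumptions (a sketch, assuming NumPy):

```python
import numpy as np

def update_hypothetic_elevation(cells, z0, search=2.0, step=0.05):
    """Coarse 1D search for the elevation that best aligns the cells with
    their reference cells.

    cells: list of (mean_rel_z, var_z, ref_mean_z, ref_var_z) per cell.
    z0: current hypothetic elevation of the subject.
    Returns the candidate elevation with the highest summed similarity.
    """
    candidates = np.arange(z0 - search, z0 + search + step, step)
    best_z, best_score = z0, -np.inf
    for z in candidates:
        score = 0.0
        for mean_rel_z, var_z, ref_mean_z, ref_var_z in cells:
            d = (mean_rel_z + z) - ref_mean_z   # elevation residual of the cell
            s2 = var_z + ref_var_z + 1e-6       # pooled variance of both distributions
            score += np.exp(-0.5 * d * d / s2)  # Gaussian similarity degree
        if score > best_score:
            best_z, best_score = z, score
    return best_z

# Two hypothetical cells whose relative elevations agree with the reference
# absolute elevations when the subject sits near 12.5 m.
cells = [(-1.52, 0.01, 11.00, 0.01), (-1.48, 0.02, 11.05, 0.02)]
z = update_hypothetic_elevation(cells, z0=12.0)
```

A finer two-stage search (coarse step, then a refined step around the best candidate) would trade more computation for better precision; the iterative variant described in connection with FIG. 8 plays a similar role.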
  • the processing device 122 may only determine the first characteristic value (s) of each cell to represent the absolute elevations of the physical points in each cell. In operation 530, the processing device 122 may only obtain the first reference characteristic value (s) of each cell to represent the absolute elevations of the reference physical points in each cell. In operation 540, for each cell, the processing device 122 may compare the physical points with the reference physical points in the cell based on the first characteristic value (s) and the first reference characteristic value (s) of the cell. In some embodiments, the data point of each physical point may include feature values of a plurality of secondary features of the physical point.
  • the processing device 122 may determine one or more second characteristic values and one or more second reference characteristic values of each cell to represent the secondary feature of the cell.
  • the processing device 122 may compare the physical points with the reference physical points in the cell based on the first characteristic value (s) , the first reference characteristic value (s) , the second characteristic values, and the second reference characteristic values of the cell.
  • the process 500 may be performed continuously or intermittently (e.g., periodically or irregularly) to update the absolute elevation of the subject (e.g., the vehicle 110) continuously or intermittently.
  • the absolute elevation of the subject determined in the process 500 may be an absolute elevation of a specific physical point (e.g., a LiDAR device) of the subject.
  • the processing device 122 may further determine an absolute elevation of another physical point of the subject based on the absolute elevation of the specific physical point and a relative position between the specific physical point and the other physical point.
  • FIG. 6 is a flowchart illustrating an exemplary process for obtaining a plurality of sets of data points corresponding to a plurality of cells in a surrounding environment of a subject according to some embodiments of the present disclosure.
  • the process 600 may be executed by the autonomous driving system 100.
  • the process 600 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240.
  • the processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may be configured to perform the process 600.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, one or more operations of the process 600 may be performed to achieve at least part of operation 510 as described in connection with FIG. 5.
  • the processing engine 122 may obtain data points representative of the surrounding environment of the subject.
  • the obtained data points may also be referred to as point-cloud data representative of the surrounding environment herein.
  • Each data point of the point-cloud data may represent a physical point in the surrounding environment, such as a physical point on the body surface of an object in the surrounding environment.
  • Each data point may include at least one feature value of at least one feature of the corresponding physical point as described in connection with operation 510.
  • the point-cloud data may be acquired by a sensor (e.g., the sensor (s) 112) assembled on the subject, such as one or more LiDAR devices as described elsewhere in the present disclosure (e.g., FIG. 1, and descriptions thereof) .
  • the sensor may emit laser pulses to the surrounding environment.
  • the laser pulses may be reflected by the physical points in the surrounding environment and return to the sensor.
  • the sensor may generate the point-cloud data representative of the surrounding environment based on one or more characteristics of the returned laser pulses.
  • the point-cloud data may be collected during a time period (e.g., 1 second, 2 seconds) when the subject (e.g., the vehicle 110) stops on or travels along a road.
  • the sensor may rotate in a scanning angle range (e.g., 360 degrees, 180 degrees, 120 degrees) and scan the surrounding environment in a certain scan frequency (e.g., 10 Hz, 15 Hz, 20 Hz) .
  • the processing device 122 may obtain the point-cloud data from the sensor or from a storage device (e.g., the storage device 140) in the autonomous driving system 100 that stores the point-cloud data. Additionally or alternatively, the processing device 122 may obtain the point-cloud data from an external source that stores the point-cloud data via the network 150.
  • the processing engine 122 (e.g., the obtaining module 410) (e.g., the processing circuits of the processor 220) may divide the point-cloud data into a plurality of sets of data points corresponding to a plurality of cells of the surrounding environment.
  • for each physical point, the processing device 122 may determine a cell to which the physical point belongs. The processing device 122 may then classify the data point representative of the physical point into the set of data points corresponding to the specific cell. For illustration purposes, the determination of a cell to which a physical point A belongs is described as an example. In some embodiments, the processing device 122 may obtain or determine a reference plane (e.g., an X-Y plane defined by the standard coordinate system of Earth, a plane being parallel or roughly parallel to the ground surface on which the autonomous vehicle moves, a plane defined by a local map associated with the subject) for the surrounding environment.
  • the reference plane may be evenly or unevenly divided into a plurality of 2D cells (each of which may have a known coordinate in the coordinate system according to certain embodiments of the present disclosure) .
  • the 2D cells may have any suitable shapes and/or sizes.
  • the reference plane may be evenly divided into the 2D cells, each of which has the shape of a regular triangle, a rectangle, a square, a regular hexagon, a circle, or the like.
  • each 2D cell may have the shape of a square, for example, a square whose side length is 10 centimeters, 20 centimeters, 30 centimeters, or the like.
  • the processing device 122 may further project the physical point A on a corresponding 2D cell of the reference plane, wherein the corresponding 2D cell may be regarded as a cell of the surrounding environment that the physical point A belongs.
  • the cell may correspond to a sub-space of the surrounding environment that includes a plurality of physical points whose projection points are within the 2D cell.
  • the physical points in the cell may refer to the physical points whose projection points are within the 2D cell.
  • the processing device 122 may determine a coordinate of the projection point of the physical point A on the reference plane, and then determine the corresponding 2D cell of the physical point A based on the coordinate of the projection point.
  • the coordinate of the projection point may be determined based on a positioning technique, for example, a template matching technique, a normalized cross-correlation (NCC) technique, a single shot detector (SSD) technique, a phase correlation technique, or the like, or any combination thereof.
  • the coordinate of the projection point may be determined based on the point-cloud data representative of the surrounding environment and a reference map (e.g., a pre-built HD map including the surrounding environment) , for example, by registering the point-cloud data and data included in the reference map.
  • Exemplary techniques for determining the coordinate of the projection point may be found in, for example, International Applications PCT/CN2019/085637 filed on May 6, 2019, and PCT/CN2019/095816 filed on July 12, 2019, both entitled “SYSTEMS AND METHODS FOR POSITIONING” , the contents of each of which are hereby incorporated by reference.
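The cell assignment described above (square 2D cells on the reference plane, each physical point assigned via its projection point) can be sketched as follows. This is a minimal illustration assuming axis-aligned square cells and NumPy; the function name `assign_cells` and the 0.2 m (20 cm) cell size are chosen here for illustration, the latter matching one of the exemplary side lengths above.

```python
import numpy as np

def assign_cells(points_xy, cell_size=0.2):
    """Assign each projected point to a square 2D cell on the reference plane.

    points_xy: (N, 2) array of X/Y coordinates of the projection points.
    cell_size: side length of each square cell in meters (e.g., 0.2 m = 20 cm).
    Returns a dict mapping a (col, row) cell index to the indices of its points.
    """
    indices = np.floor(points_xy / cell_size).astype(int)
    cells = {}
    for i, (cx, cy) in enumerate(indices):
        cells.setdefault((cx, cy), []).append(i)
    return cells

# Three projection points; with 0.2 m cells the first two share a cell.
pts = np.array([[0.05, 0.11], [0.15, 0.02], [0.45, 0.33]])
cells = assign_cells(pts)
```

Because each set of data points is independent once assigned, the per-cell sets lend themselves to the parallel processing mentioned below.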
  • the sets of data points corresponding to the cells may further be processed in parallel, thereby improving the computational efficiency and reducing the processing time.
  • the set of data points of each cell may be analyzed to determine one or more characteristic values of the cell.
  • the one or more characteristic values of each cell may further be compared with one or more reference characteristic values of the cell (which may be obtained from a location information database or be determined based on previously collected data representative of the cell) .
  • the absolute elevation of the subject may be determined according to the comparison result corresponding to each cell.
  • process 600 is merely provided for the purpose of illustration, and not intended to limit the scope of the present disclosure.
  • process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 600 are described above is not intended to be limiting.
  • FIG. 7 is a flowchart illustrating an exemplary process for determining one or more first characteristic values of a cell according to some embodiments of the present disclosure.
  • the process 700 may be executed by the autonomous driving system 100.
  • the process 700 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240.
  • the processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 122 may be configured to perform the process 700.
  • the operations of the illustrated process presented below are intended to be illustrative.
  • one or more operations of the process 700 may be performed to achieve at least part of operation 520 as described in connection with FIG. 5.
  • the process 700 may be performed for each cell in the surrounding environment of the subject to determine the first characteristic value (s) of the cell.
  • the first characteristic value (s) of a cell may include a third characteristic value indicating an overall absolute elevation of the cell and/or a fourth characteristic value indicating an absolute elevation distribution of the cell.
  • the following descriptions are provided with reference to the determination of the third and fourth characteristic values. It should be understood that this is only one exemplary embodiment.
  • the first characteristic value may only include one of the third and fourth characteristic values.
  • the processing engine 122 may determine a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the physical points in the cell.
  • the preliminary third characteristic value may indicate an overall relative elevation of the cell (i.e., an overall level of the relative elevations of the physical points in the cell) .
  • the preliminary fourth characteristic value may indicate a relative elevation distribution of the cell (i.e., a distribution of the relative elevations of the physical points in the cell) .
  • the preliminary third characteristic value may be a mean relative elevation, a median relative elevation, or any other parameter that can reflect the overall relative elevation of the cell.
  • the preliminary fourth characteristic value may be a covariance, a variance, a standard deviation of the relative elevations of the physical points in the cell, or any other parameter that can reflect the relative elevation distribution of the cell.
  • the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may determine the third characteristic value based on the preliminary third characteristic value and a hypothetic elevation of the subject in the coordinate system.
  • the hypothetic elevation may refer to a hypothetic value of the absolute elevation of the subject in the coordinate system.
  • the hypothetic elevation may be a default setting of the autonomous driving system 100, for example, 0 meters, 10 meters, 100 meters, or the like.
  • the hypothetic elevation may be inputted by a user of the autonomous driving system 100.
  • the hypothetic elevation may be determined by the processing device 122.
  • the processing device 122 may obtain an estimated location of the subject.
  • the estimated location may refer to a geographic location where the subject is located.
  • the geographic location may be denoted by a geographic coordinate (e.g., longitude and latitude coordinates) of the location.
  • the estimated location may be received from a GPS device mounted on the subject and/or the navigation system 160.
  • the processing device 122 may then determine the hypothetic elevation based on the estimated location. For example, the processing device 122 may obtain an average absolute elevation of a city or a region where the subject is located from a location information database (e.g., an HD map) , and designate the average absolute elevation as the hypothetic elevation.
  • the processing device 122 may obtain or determine one or more reference absolute elevations of one or more reference objects based on the estimated location of the subject and the location information database.
  • the reference object (s) may include any object that is located near the estimated location (e.g., within a threshold distance from the estimated location) .
  • Exemplary reference objects may include a ground surface, a lane marking, a building, a pedestrian, an animal, or the like.
  • the reference absolute elevation (s) of the reference object (s) may be obtained directly from the location information database.
  • the processing device 122 may obtain absolute elevations of a plurality of physical points of the reference object from the location information database, and determine a mean or median absolute elevation of the physical points of the reference object as the reference absolute elevation of the reference object. After the reference absolute elevation (s) of reference object (s) are obtained and/or determined, the processing device 122 may determine the hypothetic elevation based on the reference absolute elevation (s) . For example, a mean value or a median value of the reference absolute elevation (s) of the reference object (s) may be determined as the hypothetic elevation.
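The mean-or-median aggregation above can be illustrated briefly. The reference objects and elevation values below are hypothetical; the sketch takes the median elevation of each object's physical points as that object's reference absolute elevation, then the mean of those as the hypothetic elevation, one of the combinations the passage permits.

```python
from statistics import mean, median

# Hypothetical reference objects near the estimated location, each with the
# absolute elevations of several of its physical points from the database.
reference_objects = {
    "ground_surface": [12.31, 12.35, 12.28],
    "lane_marking": [12.33, 12.30],
}

# Per-object reference absolute elevation: median of its points' elevations.
ref_elevations = [median(zs) for zs in reference_objects.values()]

# Hypothetic elevation of the subject: mean of the reference elevations.
hypothetic_elevation = mean(ref_elevations)
```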
  • the third characteristic value representing the overall absolute elevation of the cell may be equal to a sum of the preliminary third characteristic value representing the overall relative elevation and the hypothetic elevation.
  • the third characteristic value may be equal to a sum of the preliminary third characteristic value, the hypothetic elevation, and an installation height of the sensor.
  • the installation height of the sensor may refer to a relative height of the sensor to a bottom of the subject (e.g., a chassis of a vehicle 110) .
  • the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may designate the preliminary fourth characteristic value representing the relative elevation distribution of the cell as the fourth characteristic value representing the absolute elevation distribution of the cell.
  • process 700 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure.
  • process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 700 are described above is not intended to be limiting.
  • operation 710 may be divided into a first sub-operation to determine the preliminary third characteristic value and a second sub-operation to determine the preliminary fourth characteristic value. Additionally or alternatively, operations 720 and 730 may be performed simultaneously or operation 730 may be performed before operation 720.
  • the processing device 122 may determine only one of the third and fourth characteristic values as the first characteristic value (s) .
  • the preliminary characteristic value (s) representing the relative elevation may be determined and/or denoted together with one or more characteristic values representing one or more other features.
  • the preliminary third characteristic value representing the overall relative elevation may be determined and/or denoted together with the fifth characteristic value representing the overall intensity.
  • a vector P x indicating both an overall relative elevation and an overall intensity of an x th cell may be determined according to Equation (1) as below:

    P x = (1/n) ∑_(k=1)^n P k ,     (1)
  • n refers to the total number (or count) of physical points in the x th cell
  • k refers to the k th physical point in the x th cell
  • P k refers to [Z k , I k ]
  • Z k and I k refer to the relative elevation and the intensity of the k th physical point, respectively.
  • the vector P x may be denoted as [Z x , I x ] , wherein Z x refers to the mean relative elevation (which is an exemplary preliminary third characteristic value) of the x th cell, and I x refers to the mean intensity (which is an exemplary fifth characteristic value) of the x th cell.
  • a vector P′ x representing an overall absolute elevation and the overall intensity of the x th cell may then be determined based on the vector P x .
  • P′ x may be denoted as [Z x +Z, I x ] , wherein Z refers to the hypothetic elevation of the subject.
  • the processing device 122 may determine a covariance matrix C x to indicate both the relative elevation distribution and the intensity distribution of the x th cell.
  • the covariance matrix C x may be designated as a covariance matrix that indicates both the absolute elevation distribution and the intensity distribution of the x th cell.
  • the covariance matrix C x may be determined according to Equation (2) as below: C_x = (1/(n-1)) Σ_{k=1}^{n} (P_k - P_x)(P_k - P_x)^T.  (2)
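The per-cell statistics described above can be sketched in a few lines of numpy. This sketch assumes Equation (1) is the mean of the point vectors and Equation (2) is the sample covariance; the function and variable names are illustrative, not from the disclosure:

```python
import numpy as np

def cell_statistics(points):
    """Compute a cell's feature vector and covariance matrix.

    `points` is an (n, 2) array whose rows are P_k = [Z_k, I_k]:
    the relative elevation and the intensity of each physical
    point in the cell. Returns (P_x, C_x): the mean vector and
    the sample covariance matrix of the cell.
    """
    pts = np.asarray(points, dtype=float)
    p_x = pts.mean(axis=0)                  # [Z_x, I_x]
    diffs = pts - p_x
    c_x = diffs.T @ diffs / (len(pts) - 1)  # 2x2 covariance
    return p_x, c_x

def absolute_vector(p_x, hypothetic_elevation):
    """Shift the mean relative elevation by the hypothetic
    elevation of the subject: P'_x = [Z_x + Z, I_x]."""
    return np.array([p_x[0] + hypothetic_elevation, p_x[1]])

# Toy cell: three points as (relative elevation, intensity).
cell = [(1.0, 10.0), (2.0, 12.0), (3.0, 14.0)]
p_x, c_x = cell_statistics(cell)
p_abs = absolute_vector(p_x, hypothetic_elevation=50.0)
```

The same two quantities (a mean vector and a covariance matrix) serve as the cell's preliminary third and fourth characteristic values when only the elevation column is used.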
  • FIG. 8 is a schematic diagram illustrating an exemplary process for determining an absolute elevation of a subject according to some embodiments of the present disclosure.
  • the process 800 may be executed by the autonomous driving system 100.
  • the process 800 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240.
  • the processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 800.
  • one or more operations of the process 800 may be performed to achieve at least part of operation 540 as described in connection with FIG. 5.
  • the process 800 may include one or more iterations.
  • the hypothetic elevation of the subject and/or the first characteristic value (s) of each cell described in connection with operation 510 may be updated.
  • a current iteration of the process 800 is described.
  • the current iteration may include one or more of the operations as shown in FIG. 8.
  • the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may determine a similarity degree between the physical points and the corresponding reference physical points in the cell in the current iteration.
  • the similarity degree corresponding to a cell may be configured to measure a difference or similarity between the physical points detected by the sensor assembled on the subject and the corresponding reference physical points stored in the location information database (e.g., an HD map) .
  • the processing device 122 may merely determine the first characteristic value(s) and the first reference characteristic value(s) of each cell to represent the absolute elevation of the cell. In this situation, the similarity degree corresponding to a cell in the current iteration may be determined based on the first characteristic value(s) of the cell in the current iteration and the first reference characteristic value(s) of the cell. In some other embodiments, the processing device 122 may further determine the second characteristic value(s) and the second reference characteristic value(s) of each cell to represent one or more secondary features of the cell.
  • in such embodiments, the similarity degree corresponding to a cell in the current iteration may be determined based on the first characteristic value(s) of the cell in the current iteration and the first reference characteristic value(s), as well as the second characteristic value(s) and the second reference characteristic value(s).
  • the determination of the similarity degree corresponding to a cell in the current iteration based on the absolute elevation and the intensity is described as an example.
  • the first characteristic value (s) in the current iteration and the second characteristic value (s) may be used to construct a feature vector representing the cell in the current iteration.
  • the first reference characteristic value (s) and the second reference characteristic value (s) may be used to construct a reference feature vector representing the cell.
  • the similarity degree of the cell in the current iteration may be represented by, for example, a cosine similarity, a Euclidean distance, or any other parameter that can measure the similarity between the feature vector of the cell in the current iteration and the corresponding reference feature vector.
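As a small illustration of the vector comparison above, the cosine similarity between a cell's feature vector and its reference vector can be computed directly. The two-element layout [overall absolute elevation, overall intensity] is an assumption for this sketch:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between a cell's feature vector and
    its reference feature vector (1.0 means identical
    direction, 0.0 means orthogonal)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Assumed layout: [overall absolute elevation, overall intensity].
feature = [52.0, 12.0]      # measured cell, current iteration
reference = [51.5, 12.5]    # from the location information database
sim = cosine_similarity(feature, reference)
```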
  • the processing device 122 may determine a cost function for a cell to measure the difference between the physical points and the reference physical points in the cell.
  • the processing device 122 may further determine the similarity degree of the cell based on the cost function. For example, a cost function and a similarity degree corresponding to an x th cell in the t th iteration may be determined according to Equations (3) and (4) as below:
  • P xr refers to a vector representing a reference overall absolute elevation and a reference overall intensity of the x th cell
  • C x refers to a covariance matrix representing an absolute elevation distribution and an intensity distribution of the x th cell
  • C xr refers to a covariance matrix representing a reference absolute elevation distribution and a reference intensity distribution of the x th cell.
  • the vector may be denoted as [Z x +Z t , I x ] , wherein Z x refers to a preliminary third characteristic value representing the overall relative elevation of the x th cell, I x refers to a fifth characteristic value representing the overall intensity of the x th cell, and Z t refers to a hypothetic elevation of the subject in the current iteration. Details regarding the C x , Z x , and I x may be found elsewhere in the present disclosure (e.g., Equation (1) in FIG. 7 and the relevant descriptions thereof) .
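Equations (3) and (4) do not survive in this text. As a hedged sketch only, a Mahalanobis-style cost with an exponential similarity is one form consistent with the vectors and covariance matrices defined above; this is an assumption, not necessarily the exact formulas of the disclosure:

```latex
% Hypothetical reconstruction: cost as a Mahalanobis distance
% between the measured and reference cell vectors with a pooled
% covariance (C_x + C_xr), and similarity as its exponential.
\mathrm{cost}_x^{t} =
  \left(P'_x - P_{xr}\right)^{\top}
  \left(C_x + C_{xr}\right)^{-1}
  \left(P'_x - P_{xr}\right) \tag{3}

S_x^{t} = \exp\!\left(-\tfrac{1}{2}\,\mathrm{cost}_x^{t}\right) \tag{4}
```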
  • the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may update the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration.
  • the processing engine 122 may update the hypothetic elevation in the current iteration using a particle filter technique.
  • the particle filter technique may utilize a set of particles (also referred to as samples), each of which represents a hypothetic state of a system and has an assigned weight.
  • the weight of a particle may represent a probability that the particle is an accurate representation of an actual state of the system.
  • the particles may be updated (e.g., resampled) iteratively according to an observation of the system until a certain condition is met. The actual state of the system may then be determined based on the updated particles after the condition is met.
  • the processing device 122 may assign a plurality of particles to the cells in the surrounding environment.
  • Each particle may have an initial state and be assigned to one or more cells of the plurality of cells.
  • the initial state of a particle may represent a possible value of the absolute elevation of the subject.
  • the initial states assigned to the particles may be the same or different.
  • the particles may be assigned with the same initial state and each particle may represent the hypothetic elevation of the subject as described in connection with operation 510 (i.e., the initial hypothetic elevation before the iteration (s) of the process 800) .
  • the particles may be assigned with different initial states representing different possible values of the absolute elevation.
  • the initial states of different particles may respectively represent an absolute elevation of a lane marking in the surrounding environment, an absolute elevation of a road sign in the surrounding environment, or the like.
  • the particles may be distributed to the cells uniformly or nonuniformly. For example, assuming that there are X cells in the surrounding environment, each cell may be assigned a particle and each particle may be assigned to N of the X cells.
  • the processing device 122 may further determine a weight of each particle in the current iteration based on the similarity degrees of the cells in the current iteration. For example, the processing engine 122 may determine a weight of a j th particle in the t th iteration according to Equation (5) as below:
  • N refers to the total number (or count) of cells that the j th particle is assigned to
  • X refers to the total number (or count) of the cells in the surrounding environment
  • the processing engine 122 may update the hypothetic elevation in the current iteration based on the weights and states of the particles in the current iteration.
  • the updated hypothetic elevation may be determined according to Equation (6) as below:
  • M refers to the total number (or count) of the particles, and the remaining symbol in Equation (6) refers to the state of the j th particle in the t th iteration.
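Equations (5) and (6) are likewise absent from this text. The sketch below assumes Equation (5) averages the similarity degrees of the N cells a particle is assigned to and then normalizes over all particles, and that Equation (6) is the weighted mean of the particle states; both forms and all names are assumptions, not the published formulas:

```python
def particle_weights(similarities, assignment, n_particles):
    """Eq. (5)-style weights (assumed form): each particle's raw
    weight is the mean similarity degree of the N cells it is
    assigned to; weights are then normalized over all particles.

    `similarities[x]` is the similarity degree of the x-th cell;
    `assignment[j]` lists the cell indices of the j-th particle.
    """
    raw = []
    for j in range(n_particles):
        cells = assignment[j]
        raw.append(sum(similarities[x] for x in cells) / len(cells))
    total = sum(raw)
    return [w / total for w in raw]

def updated_hypothetic_elevation(weights, states):
    """Eq. (6)-style update (assumed form): the weighted mean of
    the particle states, i.e. sum over j of w_j * z_j."""
    return sum(w * z for w, z in zip(weights, states))

sims = [0.9, 0.5, 0.7, 0.3]   # similarity degree per cell
assign = [[0, 1], [2, 3]]     # N = 2 cells per particle
w = particle_weights(sims, assign, n_particles=2)
z_new = updated_hypothetic_elevation(w, states=[49.0, 51.0])
```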
  • the processing engine 122 may determine whether a termination condition is satisfied in the current iteration.
  • An exemplary termination condition may be that the difference between the hypothetic elevation and the updated hypothetic elevation in the current iteration is within a threshold, indicating that the hypothetic elevation has converged.
  • Other exemplary termination conditions may include that a certain count of iterations has been performed, that a difference between the particles in the current iteration and the particles in the previous iteration is within a threshold such that the particles of the current iteration converge, that an overall similarity degree (e.g., a mean similarity degree) corresponding to the cells exceeds a threshold, etc.
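The termination check described above can be sketched as a single helper. The threshold values are illustrative, not values from the disclosure:

```python
def termination_satisfied(prev_elev, new_elev, iteration,
                          elev_tol=0.01, max_iters=100,
                          mean_sim=None, sim_threshold=None):
    """Check the termination conditions described above:
    elevation convergence, an iteration cap, or (optionally)
    the mean similarity degree exceeding a threshold."""
    # Condition 1: the hypothetic elevation has converged.
    if abs(new_elev - prev_elev) <= elev_tol:
        return True
    # Condition 2: a fixed count of iterations has been performed.
    if iteration >= max_iters:
        return True
    # Condition 3 (optional): overall similarity is high enough.
    if sim_threshold is not None and mean_sim is not None:
        if mean_sim > sim_threshold:
            return True
    return False
```

When the check returns True, the updated hypothetic elevation of the current iteration is designated as the absolute elevation of the subject; otherwise the next iteration begins.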
  • the process 800 may proceed to 870.
  • the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may designate the updated hypothetic elevation in the current iteration as the absolute elevation of the subject.
  • the process 800 may proceed to operations 840 to 860.
  • the processing engine 122 may update the first characteristic value (s) of the cell in the current iteration based on the updated hypothetic elevation.
  • the first characteristic value (s) of the cell may include a third characteristic value representing an overall absolute elevation of the cell and/or a fourth characteristic value representing an absolute elevation distribution of the cell.
  • the third characteristic value of the cell may need to be updated and the fourth characteristic value may be omitted from the updating.
  • the updated third characteristic value may be equal to a sum of the preliminary third characteristic value representing the overall relative elevation of the cell (e.g., the mean relative elevation of the cell) and the updated hypothetic elevation in the current iteration.
  • the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may designate the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration.
  • the processing engine 122 (e.g., the absolute elevation determination module 430) may designate the updated first characteristic value(s) of each cell in the current iteration as the first characteristic value(s) of the cell in the next iteration.
  • the process 800 may proceed to operation 810 again to perform the next iteration until the termination condition is satisfied.
  • the processing device 122 may update the particles in the current iteration based on the weights of the particles determined in the current iteration.
  • the processing device 122 may further designate the updated particles as the particles in the next iteration.
  • the processing device 122 may update the particles by resampling. For example, the processing device 122 may remove one or more particles if their weight (s) determined in the current iteration are smaller than a first threshold. As another example, the processing device 122 may replicate one or more particles if their weight (s) determined in the current iteration are greater than a second threshold.
  • the processing device 122 may update the particles by updating the state(s) of one or more particles. Merely by way of example, the processing device 122 may determine an updated possible value of the absolute elevation of the subject as an updated state of a certain particle.
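The threshold-based resampling described above can be sketched as follows; the low/high thresholds are illustrative values, not from the disclosure:

```python
def resample_by_threshold(particles, weights, low=0.02, high=0.2):
    """Resample particles as described above: drop particles
    whose weight is below `low` and replicate those whose
    weight exceeds `high`. `particles` is a list of states
    (candidate elevations); returns the updated particle list.
    """
    survivors = []
    for state, weight in zip(particles, weights):
        if weight < low:
            continue                 # remove low-weight particle
        survivors.append(state)
        if weight > high:
            survivors.append(state)  # replicate high-weight particle
    return survivors

states = [49.0, 50.0, 51.0, 52.0]
weights = [0.01, 0.30, 0.40, 0.29]
new_states = resample_by_threshold(states, weights)
```

The updated list then serves as the particle set for the next iteration.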
  • the process 800 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed.
  • the process 800 may further include an operation to store the absolute elevation and/or an operation to transmit the absolute elevation to a terminal device associated with the subject (e.g., a built-in computer of the vehicle 110) for presentation.
  • Operations 840 to 860 may be performed in any order.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of software and hardware that may all generally be referred to herein as a “unit, ” “module, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in a combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user’s computer, partly on the user’s computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user’s computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • the numbers expressing quantities or properties used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about, ” “approximate, ” or “substantially. ”
  • “about, ” “approximate, ” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated.
  • the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment.
  • the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.

Abstract

A system for determining an absolute elevation of a subject is provided. The subject may be located in a surrounding environment. For each cell of a plurality of cells in the surrounding environment, the system may obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including a relative elevation of the physical point. For each cell, the system may determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject and the relative elevations of the corresponding physical points, and obtain absolute elevations of reference physical points. The system may further determine the absolute elevation of the subject by updating the hypothetic elevation of the subject by comparing the physical points with the reference physical points in each cell.

Description

POSITIONING SYSTEMS AND METHODS

TECHNICAL FIELD
This present disclosure generally relates to positioning systems and methods, and in particular, to systems and methods for determining an absolute elevation of a subject automatically, e.g., in an autonomous driving context.
BACKGROUND
Positioning technologies are widely used in various fields, such as navigation systems, e.g., navigation for autonomous driving systems. For an autonomous driving system, it is important to determine a precise position of a subject (e.g., the autonomous vehicle) in a coordinate system (e.g., the standard coordinate system of the Earth). The position of the subject may be represented by, for example, a longitude, a latitude, and an absolute elevation of the subject in the coordinate system. Normally, the position of the subject may be determined based on point-cloud data acquired by one or more sensors (e.g., a LiDAR device) mounted on the subject. However, the point-cloud data may include limited information that can be used to determine the absolute elevation of the subject in the coordinate system. Therefore, it is desirable to provide effective systems and methods for determining the absolute elevation of the subject in the coordinate system, thus improving positioning accuracy and efficiency.
SUMMARY
According to an aspect of the present disclosure, a system may be provided. The system may be configured for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment. The system may include at least one storage medium including a set of instructions, and at least one processor in communication with the at least one storage medium. When executing the instructions, the at least one processor may be configured to direct the system to perform the following operations. For each cell of a plurality of cells in the surrounding environment, the system may obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point. The at least one feature value may be acquired by a sensor assembled on the subject. The at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor. For each cell of a plurality of cells in the surrounding environment, the system may further determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor. The at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system. For each cell of a plurality of cells in the surrounding environment, the system may obtain at least one first reference characteristic value of the cell based on a location information database. The at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system.
The system may also determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system. The updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
In some embodiments, to obtain the set of data points representative of each cell, the system may obtain the data points representative of the surrounding environment, and divide the data points representative of the surrounding environment into the plurality of sets of data points corresponding to the plurality of cells of the surrounding environment.
In some embodiments, the at least one feature of the physical point may further include a secondary feature of the physical point. For each cell, the system may further determine at least one second characteristic value of the cell based on the feature values of the secondary feature of the corresponding physical points. The at least one second characteristic value may represent the secondary feature of the corresponding physical points. For each cell, the system may also obtain at least one second reference characteristic value of the cell based on the location information database. The at least one second reference characteristic value represents the secondary feature of the reference physical points in the cell. The comparing the physical points with the reference physical points in each cell may be further based on the at least one second characteristic value of each cell and the at least one second reference characteristic value of each cell.
In some embodiments, the secondary feature may include at least one of an intensity, a coordinate in a reference coordinate system associated with the sensor, a classification, or a scan direction.
In some embodiments, the secondary feature may be an intensity. For each cell, the at least one second characteristic value of the cell may include a characteristic value indicating an overall intensity of the cell and a characteristic value indicating an intensity distribution of the cell.
In some embodiments, for each cell, the at least one first characteristic value may include a third characteristic value indicating an overall absolute elevation of the cell and a fourth characteristic value indicating an absolute elevation distribution of the cell. The system may determine a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the corresponding physical points. The preliminary third characteristic value may indicate an overall relative elevation of the cell. The preliminary fourth characteristic value may indicate a relative elevation distribution of the cell. The system may also determine the third characteristic value based on the preliminary third characteristic value and the hypothetic elevation, and designate the preliminary fourth characteristic value as the fourth characteristic value.
In some embodiments, the preliminary third characteristic value of the cell may be a mean value of the relative elevations of the corresponding physical points. The preliminary fourth characteristic value of the cell may be a covariance of the relative elevations of the corresponding physical points.
In some embodiments, the determining the absolute elevation of the subject in the coordinate system includes one or more iterations. For each cell, each current iteration may include determining a similarity degree between the corresponding physical points and the corresponding reference physical points in the current iteration based on the at least one first characteristic value of the cell in the current iteration and the at least one first reference characteristic value of the cell. For each cell, each current iteration may also include updating the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration, and determining whether a termination condition is satisfied in the current iteration. For each cell, in response to a determination that the termination condition is satisfied, each current iteration may further include designating the updated hypothetic elevation in the current iteration as the absolute elevation.
In some embodiments, in response to a determination that the termination condition is not satisfied, each current iteration may also include updating the at least one first characteristic value of each cell in the current iteration based on the updated hypothetic elevation in the current iteration, designating the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration, and designating the updated first characteristic value of each cell in the current iteration as the first characteristic value of the cell in the next iteration.
In some embodiments, the determining the absolute elevation of the subject in the coordinate system may be based on a particle filtering technique.
In some embodiments, the determining the absolute elevation of the subject in the coordinate system may further include assigning a plurality of particles to the plurality of cells, each of the plurality of particles having a state and being assigned to one or more cells of the plurality of cells. The updating the hypothetic elevation in the current iteration may include determining a weight for each particle of the plurality of particles in the current iteration based on the similarity degrees of the plurality of cells in the current iteration, and determining the updated hypothetic  elevation in the current iteration based on the weights and the states of the plurality of particles in the current iteration.
In some embodiments, the updating the hypothetic elevation in the current iteration may further include updating the plurality of particles in the current iteration based on the weights of the plurality of particles in the current iteration, and designating the updated particles in the current iteration as the plurality of particles in the next iteration.
In some embodiments, for each cell, to obtain the at least one first reference characteristic value representing the absolute elevations of the plurality of reference physical points in the cell, the system may obtain a reference set of reference data points corresponding to the cell from the location information database. Each reference data point in the reference set may represent a reference physical point in the cell and include an absolute elevation of the reference physical point in the coordinate system. The system may also determine the at least one first reference characteristic value of the cell based on the corresponding reference set.
In some embodiments, the system may further obtain one or more reference absolute elevations of one or more reference objects based on an estimated location of the subject and the location information database, and determine the hypothetic elevation of the subject in the coordinate system based on the one or more reference absolute elevations.
According to another aspect of the present disclosure, a method is provided. The method may be configured for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment. The method may include the following operations. For each cell of a plurality of cells in the surrounding environment, the method may include obtaining a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point. The at least one feature value may be acquired by a sensor assembled on the subject. The at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor. For each cell of a plurality of cells in the surrounding environment, the method may also include determining at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor. The at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system. For each cell of a plurality of cells in the surrounding environment, the method may further include obtaining at least one first reference characteristic value of the cell based on a location information database. The at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system. The method may further include determining the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
The updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
According to a further aspect of the present disclosure, a non-transitory computer readable medium embodying a computer program product is provided. The computer program product may include instructions for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment. The instructions may be configured to cause a computing device to perform the following operations. For each cell of a plurality of cells in the surrounding environment, the computing device may obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point. The at least one feature value may be acquired by a sensor assembled on the subject. The at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor. For each cell of a plurality of cells in the surrounding environment, the computing device may also determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor. The at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system. For each cell of a plurality of cells in the surrounding environment, the computing device may further obtain at least one first reference characteristic value of the cell based on a location information database. The at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell in the coordinate system. The computing device may also determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
The updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
According to a still further aspect of the present disclosure, a system is provided. The system may be configured for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment. The system may include an obtaining module, a characteristic determination module, and an absolute elevation determination module. The obtaining module may be configured to, for each cell of a plurality of cells in the surrounding environment, obtain a set of data points representative of the cell. Each data point in the set may represent a physical point in the cell and include at least one feature value of at least one feature of the physical point. The at least one feature value may be acquired by a sensor assembled on the subject. The at least one feature of the physical point may include a relative elevation of the physical point with respect to the sensor. The characteristic determination module may be configured to, for each cell, determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor. The at least one first characteristic value may represent absolute elevations of the corresponding physical points in the coordinate system. The obtaining module may be further configured to, for each cell, obtain at least one first reference characteristic value of the cell based on a location information database. The at least one first reference characteristic value may represent absolute elevations of a plurality of reference physical points in the cell. The absolute elevation determination module may be configured to determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
The updating the hypothetic elevation of the subject may include comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities and combinations set forth in the detailed examples discussed below.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure;
FIG. 4 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
FIG. 5 is a flowchart illustrating an exemplary process for determining an absolute elevation of a subject in a coordinate system according to some embodiments of the present disclosure;
FIG. 6 is a flowchart illustrating an exemplary process for obtaining a plurality of sets of data points corresponding to a plurality of cells in a surrounding environment of a subject according to some embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating an exemplary process for determining one or more first characteristic values of a cell according to some embodiments of the present disclosure; and
FIG. 8 is a schematic diagram illustrating an exemplary process for determining an absolute elevation of a subject according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
The following description is presented to enable any person skilled in the art to make and use the present disclosure, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as  well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise, ” “comprises, ” and/or “comprising, ” “include, ” “includes, ” and/or “including, ” when used in this disclosure, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended to limit the scope of the present disclosure. It is understood that the drawings are not to scale.
The flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in the order shown. Conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
Moreover, while the systems and methods disclosed in the present disclosure are described primarily regarding determining an absolute elevation of a subject (e.g., an autonomous vehicle) in an autonomous driving system, it should be understood that this is only one exemplary embodiment. The systems and methods of the present disclosure may be applied to any other kind of transportation system. For example, the systems and methods of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof. The vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, or the like, or any combination thereof.
An aspect of the present disclosure relates to systems and methods for determining an absolute elevation of a subject in a coordinate system (e.g., a standard coordinate system for the Earth) . As used herein, the absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below a fixed reference point, line, or plane defined by the coordinate system. To this end, the systems and methods may obtain a plurality of sets of data points, each of which represents a cell in a surrounding environment of the subject. The sets of data points may be acquired by a sensor (e.g., a LiDAR device) mounted on the subject. Each data point in a set representative of a cell may be associated with a physical point in the cell and include one or more feature values of one or more features of the physical point. The feature (s) of a physical point may include a relative elevation of the physical point with respect to the sensor. For each cell, the systems and methods may determine one or more first characteristic values representing absolute elevations of the physical points in the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the physical points in the cell. For each cell, the systems and methods may also obtain or determine one or more first reference characteristic values representing absolute elevations of a plurality of reference physical points in the cell based on a location information database (e.g., a pre-built high-definition (HD) map covering the surrounding environment) . The systems and methods may further determine the absolute elevation of the subject by updating the hypothetic elevation of the subject. The update of the hypothetic elevation may include a comparison between the physical points and the reference physical points in each cell according to the first characteristic value (s) and the first reference characteristic value (s) of each cell.
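The iterative refinement described above can be outlined in code. The sketch below is illustrative only: the function names, the use of the per-cell mean absolute elevation as the sole first characteristic value, and the simple averaging update rule are assumptions for exposition, not the claimed implementation.

```python
def estimate_subject_elevation(cells, reference, z0, iterations=10):
    """Refine a hypothetic subject elevation z0 so that per-cell mean
    absolute elevations of sensed points match reference values.

    cells: dict mapping cell_id -> list of relative elevations (physical
           point elevation minus sensor elevation, from the on-board sensor)
    reference: dict mapping cell_id -> reference mean absolute elevation
               (e.g., from a pre-built HD map)
    z0: initial hypothetic elevation of the subject in the coordinate system
    """
    z = z0
    for _ in range(iterations):
        residuals = []
        for cell_id, rel_elevations in cells.items():
            # First characteristic value of the cell under the current
            # elevation hypothesis: mean absolute elevation of its points.
            mean_abs = z + sum(rel_elevations) / len(rel_elevations)
            # Compare against the first reference characteristic value.
            residuals.append(reference[cell_id] - mean_abs)
        # Shift the hypothesis by the average discrepancy across cells.
        correction = sum(residuals) / len(residuals)
        z += correction
        if abs(correction) < 1e-6:  # converged
            break
    return z
```

For instance, if the sensed points sit 10 m above the reference points under an initial hypothesis of 0 m, the hypothesis converges to 10 m after one correction.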
According to some embodiments of the present disclosure, the absolute elevation of the subject may be determined based on a comparison of the physical points in the surrounding environment detected by the sensor and the corresponding  reference physical points stored in the location information database. The comparison may be performed for each cell in a parallel manner instead of for the whole surrounding environment, which may improve computational efficiency and reduce processing time. In addition, the physical points in a cell may be compared with the corresponding reference physical points according to one or more characteristic values and one or more reference characteristic values representing one or more features of the cell. This may be more efficient compared with comparing the physical points with the reference physical points in the cell directly according to feature values of the physical points and the corresponding reference physical points.
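To illustrate the efficiency argument above, each cell can be reduced to a few summary statistics that are compared once per cell rather than once per point. This is a minimal sketch under assumptions: the particular characteristic values (mean and variance of elevation) and the matching tolerances are chosen here for illustration, not prescribed by the disclosure.

```python
import statistics


def cell_characteristics(elevations):
    # Summarize a cell by two characteristic values: the mean and the
    # population variance of its points' absolute elevations.
    mean = statistics.fmean(elevations)
    var = statistics.pvariance(elevations, mu=mean)
    return mean, var


def cells_match(char_a, char_b, mean_tol=0.2, var_tol=0.5):
    # Two cells "match" when both characteristic values agree within
    # tolerances -- one comparison per cell instead of per point.
    return (abs(char_a[0] - char_b[0]) <= mean_tol
            and abs(char_a[1] - char_b[1]) <= var_tol)
```

With, say, thousands of points per cell, this reduces each per-cell comparison from thousands of point-wise checks to a constant-size comparison of summary values.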
FIG. 1 is a schematic diagram illustrating an exemplary autonomous driving system 100 according to some embodiments of the present disclosure. In some embodiments, the autonomous driving system 100 may include one or more vehicles 110 (vehicles 110-1, 110-2…110-n) , a server 120, a terminal device 130, a storage device 140, a network 150, and a navigation system 160 (also referred to as a positioning system) .
The vehicle 110 may carry a passenger to travel to a destination. In some embodiments, the vehicle 110 may be an autonomous vehicle. The autonomous vehicle may refer to a vehicle that is capable of achieving a certain level of driving automation. Exemplary levels of driving automation may include a first level at which the vehicle is mainly supervised by a human and has a specific autonomous function (e.g., autonomous steering or accelerating) , a second level at which the vehicle has one or more advanced driver assistance systems (ADAS) (e.g., an adaptive cruise control system, a lane-keeping system) that can control the braking, steering, and/or acceleration of the vehicle, a third level at which the vehicle is able to drive autonomously when one or more certain conditions are met, a fourth level at which the vehicle can operate without human input or oversight but still is subject to some constraints (e.g., be confined to a certain area) , a fifth level at which the vehicle can operate autonomously under all circumstances, or the like, or any combination thereof.
In some embodiments, the vehicle 110 may be an electric vehicle, a fuel cell vehicle, a hybrid vehicle, a conventional internal combustion engine vehicle, or any other type of vehicle. The vehicle 110 may be a sports vehicle, a coupe, a sedan, a pick-up truck, a station wagon, a sports utility vehicle (SUV) , a minivan, a conversion van, or have any other style. The vehicle 110 may include one or more similar components as a conventional vehicle, for example, a chassis, a suspension, a steering device (e.g., a steering wheel) , a brake device (e.g., a brake pedal) , an accelerator, etc. Merely by way of example, the vehicle 110 may have a body and at least one wheel, e.g., a pair of front wheels and a pair of rear wheels. The vehicle 110 may be all-wheel drive (AWD) , front wheel drive (FWD) , or rear wheel drive (RWD) . In some embodiments, the vehicle 110 may be operated by an operator occupying the vehicle, under a remote control, and/or autonomously. In some embodiments, the vehicle 110 may be a survey vehicle configured to acquire data for constructing an HD map or three-dimensional (3D) city model.
In some embodiments, as illustrated in FIG. 1, the vehicle 110 may be equipped with one or more sensors 112 such that the vehicle 110 is capable of sensing its surrounding environment. The sensor (s) 112 may be mounted on the vehicle 110 using any suitable mounting mechanism. The mounting mechanism may be an electro-mechanical device installed or otherwise attached to the body of the vehicle 110. For example, the mounting mechanism may use one or more screws, adhesives, or another mounting mechanism. The sensor (s) 112 may be mounted on any position of the vehicle 110, for example, inside or outside the body of the vehicle.
The sensor (s) 112 of the vehicle 110 may include any sensor that is capable of collecting information related to the surrounding environment of the vehicle 110. For example, the sensor (s) 112 may include a light detection and ranging (LiDAR) device, a camera, a GPS device, an inertial measurement unit (IMU) sensor, a radar, a sonar, or the like, or any combination thereof. The LiDAR device may be configured to scan the surrounding environment and acquire point-cloud data representative of the surrounding environment. For example, the LiDAR device  may measure a distance to an object by illuminating the object with light pulses and measuring the reflected pulses. Differences in light return times and wavelengths may then be used to construct a 3D representation of the object. The light pulses used by the LiDAR device may be ultraviolet, visible, near infrared, etc. The camera may be configured to obtain one or more images relating to objects (e.g., a person, an animal, a tree, a roadblock, a building, or a vehicle) that are within the scope of the camera. The GPS device may be configured to receive geo-location and time information from GPS satellites and then determine the device's geographical position. The IMU sensor may be configured to measure and provide the vehicle’s specific force, angular rate, and sometimes the magnetic field surrounding the vehicle, using one or more inertial sensors, such as an accelerometer and a gyroscope.
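The time-of-flight principle mentioned above reduces to a one-line computation. This is a generic sketch of the physics, not a description of any particular LiDAR device:

```python
def lidar_range(round_trip_seconds, c=299_792_458.0):
    # A LiDAR pulse travels to the object and back, so the range is half
    # of the speed of light times the measured round-trip time.
    return c * round_trip_seconds / 2.0

# A pulse returning after roughly 200 nanoseconds corresponds to a
# range of about 30 meters.
```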
In some embodiments, the server 120 may be a single server or a server group. The server group may be centralized or distributed (e.g., the server 120 may be a distributed system) . In some embodiments, the server 120 may be local or remote. For example, the server 120 may access information and/or data stored in the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160 via the network 150. As another example, the server 120 may be directly connected to the terminal device 130, the sensor (s) 112, the vehicle 110, and/or the storage device 140 to access stored information and/or data. In some embodiments, the server 120 may be implemented on a cloud platform or an onboard computer. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, the server 120 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
In some embodiments, the server 120 may include a processing device 122. The processing device 122 may process information and/or data associated with the autonomous driving system 100 to perform one or more functions described in the  present disclosure. For example, the processing device 122 may determine a position (e.g., an absolute elevation) of the vehicle 110 in a coordinate system (e.g., a world geodetic system) according to data associated with a surrounding environment collected by the sensor (s) 112. Particularly, in certain embodiments, the sensor (s) 112 may continuously or intermittently (e.g., periodically or irregularly) collect data associated with the surrounding environment when the vehicle 110 moves. The processing device 122 may determine the position of the vehicle 110 in the coordinate system in real-time or intermittently (e.g., periodically or irregularly) .
In some embodiments, the processing device 122 may include one or more processing devices (e.g., single-core processing device (s) or multi-core processor (s) ) . Merely by way of example, the processing device 122 may include a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a digital signal processor (DSP) , a field programmable gate array (FPGA) , a programmable logic device (PLD) , a controller, a microcontroller unit, a reduced instruction-set computer (RISC) , a microprocessor, or the like, or any combination thereof.
In some embodiments, the server 120 may be connected to the network 150 to communicate with one or more components (e.g., the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160) of the autonomous driving system 100. In some embodiments, the server 120 may be directly connected to or communicate with one or more components (e.g., the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, and/or the navigation system 160) of the autonomous driving system 100. In some embodiments, the server 120 may be integrated into the vehicle 110. For example, the server 120 may be a computing device (e.g., a computer) installed in the vehicle 110.
In some embodiments, the terminal device 130 may enable a user interaction between a user (e.g., a driver of the vehicle 110) and one or more components of the autonomous driving system 100. The terminal device 130 may  include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a vehicle 130-4, a smart watch 130-5, or the like, or any combination thereof. In some embodiments, the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and/or the augmented reality device may include a Google TM Glass, an Oculus Rift, a HoloLens, a Gear VR, etc. In some embodiments, the built-in device in the vehicle 130-4 may include an onboard computer, an onboard television, etc. In some embodiments, the server 120 may be integrated into the terminal device 130. 
In some embodiments, the terminal device 130 may be integrated into the vehicle 110.
The storage device 140 may store data and/or instructions. In some embodiments, the storage device 140 may store data obtained from the terminal device 130, the sensor (s) 112, the vehicle 110, the navigation system 160, the processing device 122, and/or an external storage device. For example, the storage device 140 may store an HD map of an area (e.g., a country, a city, a street) and/or one or more characteristic values of the area (e.g., a mean absolute elevation of physical points in the area) . In some embodiments, the storage device 140 may  store data and/or instructions that the server 120 may execute or use to perform exemplary methods described in the present disclosure. For example, the storage device 140 may store instructions that the processing device 122 may execute or use to determine an absolute elevation of the vehicle 110 in a coordinate system (e.g., a world geodetic system) .
In some embodiments, the storage device 140 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage devices may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage devices may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. Exemplary volatile read-and-write memory may include a random access memory (RAM) . Exemplary RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc. Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically-erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc. In some embodiments, the storage device 140 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 140 may be connected to the network 150 to communicate with one or more components (e.g., the server 120, the terminal device 130, the sensor (s) 112, the vehicle 110, and/or the navigation system 160) of the autonomous driving system 100. One or more components of the autonomous driving system 100 may access the data or instructions stored in the storage device 140 via the network 150. In some embodiments, the storage device 140 may be directly connected to or communicate with one or more components (e.g., the server 120, the terminal device 130, the sensor (s) 112, the vehicle 110,  and/or the navigation system 160) of the autonomous driving system 100. In some embodiments, the storage device 140 may be part of the server 120. In some embodiments, the storage device 140 may be integrated into the vehicle 110.
The network 150 may facilitate the exchange of information and/or data. In some embodiments, one or more components (e.g., the server 120, the terminal device 130, the sensor (s) 112, the vehicle 110, the storage device 140, or the navigation system 160) of the autonomous driving system 100 may send information and/or data to other component (s) of the autonomous driving system 100 via the network 150. For example, the server 120 may obtain data related to a surrounding environment of the vehicle 110 from the sensor (s) 112 mounted on the vehicle 110 via the network 150. In some embodiments, the network 150 may be any type of wired or wireless network, or combination thereof. Merely by way of example, the network 150 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , a metropolitan area network (MAN) , a public switched telephone network (PSTN) , a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 150 may include one or more network access points. For example, the network 150 may include wired or wireless network access points (e.g., 150-1, 150-2) , through which one or more components of the autonomous driving system 100 may be connected to the network 150 to exchange data and/or information.
The navigation system 160 may determine information associated with an object, for example, one or more terminal devices 130, the vehicle 110, etc. In some embodiments, the navigation system 160 may be a global positioning system (GPS) , a global navigation satellite system (GLONASS) , a compass navigation system (COMPASS) , a BeiDou navigation satellite system, a Galileo positioning system, a quasi-zenith satellite system (QZSS) , etc. The information may include a location, an elevation, a velocity, or an acceleration of the object, or a current time.  The navigation system 160 may include one or more satellites, for example, a satellite 160-1, a satellite 160-2, and a satellite 160-3. The satellites 160-1 through 160-3 may determine the information mentioned above independently or jointly. The navigation system 160 may send the information mentioned above to the network 150, the terminal device 130, or the vehicle 110 via wireless connections.
It should be noted that the autonomous driving system 100 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, the autonomous driving system 100 may further include one or more additional components, such as an information source, a location information database (which may be an independent part of the autonomous driving system 100 or integrated into the storage device 140) . As another example, one or more components of the autonomous driving system 100 may be omitted or be replaced by one or more other devices that can realize similar functions. In some embodiments, a GPS device of a vehicle 110 may be replaced by another positioning device, such as BeiDou. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 2 is a schematic diagram illustrating exemplary hardware and software components of a computing device 200 according to some embodiments of the present disclosure. The computing device 200 may be used to implement any component of the autonomous driving system 100 as described herein. For example, the server 120 (e.g., the processing device 122) and/or the terminal device 130 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof. Although only one such computing device is shown, for convenience, the computer functions relating to the autonomous driving system 100 as described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
As illustrated in FIG. 2, the computing device 200 may include a communication bus 210, a processor 220, a storage device, an input/output (I/O)  260, and a communication port 250. The processor 220 may execute computer instructions (e.g., program code) and perform functions of one or more components of the autonomous driving system 100 in accordance with techniques described herein. For example, the processor 220 may determine an absolute elevation of the vehicle 110 in a coordinate system (e.g., a world geodetic system) . The computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions described herein. In some embodiments, the processor 220 may include interface circuits and processing circuits therein. The interface circuits may be configured to receive electronic signals from the communication bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process. The processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the communication bus 210.
In some embodiments, the processor 220 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC) , an application specific integrated circuits (ASICs) , an application-specific instruction-set processor (ASIP) , a central processing unit (CPU) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a microcontroller unit, a digital signal processor (DSP) , a field programmable gate array (FPGA) , an advanced RISC machine (ARM) , a programmable logic device (PLD) , any circuit or processor capable of executing one or more functions, or the like, or any combinations thereof.
Merely for illustration, only one processor 220 is described in the computing device 200. However, it should be noted that the computing device 200 in the present disclosure may also include multiple processors, thus operations and/or method steps that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors. For example, if in the present disclosure the processor of the computing device 200  executes both step A and step B, it should be understood that step A and step B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes step A and a second processor executes step B, or the first and second processors jointly execute steps A and B) .
The storage device may store data/information related to the autonomous driving system 100. In some embodiments, the storage device may include a mass storage device, a removable storage device, a volatile read-and-write memory, a random access memory (RAM) 240, a read-only memory (ROM) 230, a disk 270, or the like, or any combination thereof. In some embodiments, the storage device may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure. For example, the storage device may store a program for the processor 220 to execute.
The I/O 260 may input and/or output signals, data, information, etc. In some embodiments, the I/O 260 may enable a user interaction with the computing device 200. In some embodiments, the I/O 260 may include an input device and an output device. Examples of the input device may include a keyboard, a mouse, a touch screen, a microphone, or the like, or a combination thereof. Examples of the output device may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof. Examples of the display device may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , a touch screen, or the like, or a combination thereof.
The communication port 250 may be connected to a network (e.g., the network 120) to facilitate data communications. The communication port 250 may establish connections between the computing device 200 and one or more components of the autonomous driving system 100. The connection may be a wired connection, a wireless connection, any other communication connection that can enable data transmission and/or reception, and/or any combination of these connections. The wired connection may include, for example, an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof. The wireless connection may include, for example, a Bluetooth™ link, a Wi-Fi™ link, a WiMax™ link, a WLAN link, a ZigBee link, a mobile network link (e.g., 3G, 4G, 5G, etc.), or the like, or a combination thereof. In some embodiments, the communication port 250 may be and/or include a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 250 may be a specially designed communication port.
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device 300 according to some embodiments of the present disclosure. In some embodiments, one or more components (e.g., the processing device 122 and/or the terminal device 130) may be implemented on one or more components of the mobile device 300.
As illustrated in FIG. 3, the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340. The applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to positioning or other information from the processing device 122. User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 122 and/or other components of the autonomous driving system 100 via the network 150.
To implement various modules, units, and their functionalities described in the present disclosure, computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein. A computer with user interface elements may be used to implement a personal computer (PC) or any other  type of work station or terminal device. A computer may also act as a server if appropriately programmed.
FIG. 4 is a block diagram illustrating an exemplary processing device 122 according to some embodiments of the present disclosure. The processing device 122 may include an obtaining module 410, a characteristic determination module 420, and an absolute elevation determination module 430.
The obtaining module 410 may obtain information related to the autonomous driving system 100. For example, the obtaining module may obtain point-cloud data representative of a surrounding environment of a subject, a plurality of sets of data points representing a plurality of cells in the surrounding environment, one or more reference characteristic values of the cells, an estimated location of the subject, or the like, or any combination thereof. Details regarding the information obtained by the obtaining module 410 may be found elsewhere in the present disclosure (e.g., FIG. 5 and the relevant descriptions thereof) .
The characteristic determination module 420 may determine one or more characteristic values and/or one or more reference characteristic values of a cell. The characteristic value (s) of the cell may include, for example, one or more first characteristic values representing absolute elevations of a plurality of physical points in the cell, one or more second characteristic values representing a secondary feature of the physical points in the cell, or the like, or any combination thereof. The reference characteristic value (s) of the cell may include, for example, one or more first reference characteristic values representing absolute elevations of a plurality of reference physical points in the cell, one or more second reference characteristic values representing a secondary feature of the reference physical points in the cell, or the like, or any combination thereof. Details regarding the characteristic value (s) and/or the reference characteristic value (s) may be found elsewhere in the present disclosure (e.g., FIG. 5 and the relevant descriptions thereof) .
The absolute elevation determination module 430 may be configured to determine an absolute elevation of the subject in a coordinate system. In some embodiments, the absolute elevation determination module 430 may determine the  absolute elevation by updating a hypothetic elevation of the subject in the coordinate system. Details regarding the determination of the absolute elevation may be found elsewhere in the present disclosure (e.g., operation 540 and FIG. 7 and the relevant descriptions thereof) .
The modules in the processing device 122 may be connected to or communicate with each other via a wired connection or a wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof. Two or more of the modules may be combined into a single module, and any one of the modules may be divided into two or more units. In some embodiments, one or more modules described above may be omitted. Additionally or alternatively, the processing device 122 may include one or more additional modules. For example, the processing device 122 may further include a storage module (not shown in FIG. 4) . In some embodiments, one or more modules may be combined into a single module. For example, the characteristic determination module 420 and the absolute elevation determination module 430 may be combined into a single module.
FIG. 5 is a flowchart illustrating an exemplary process for determining an absolute elevation of a subject in a coordinate system according to some embodiments of the present disclosure. The process 500 may be executed by the autonomous driving system 100. For example, the process 500 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240. The processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 122 may be configured to perform the process 500.
As used herein, the subject may refer to any composition of organic and/or inorganic matter, with or without life, that is located on earth. For example, the subject may be any vehicle (e.g., car, boat, or aircraft) or any person. In certain embodiments, the subject may be an autonomous vehicle (e.g., the vehicle 110) as described elsewhere in the present disclosure (e.g., FIG. 1 and the relevant descriptions). The absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below a fixed reference point, line, or plane defined by the coordinate system. For example, the coordinate system may be a standard coordinate system for the Earth that defines a sea level or a geoid of the earth, and the absolute elevation of the subject in the coordinate system may refer to a height of the subject above or below the sea level or the geoid of the earth. For brevity, the absolute elevation of an entity (e.g., the subject) in the coordinate system is referred to as the absolute elevation of the entity herein. In some embodiments, the subject may have a plurality of physical points on or within the subject. The absolute elevation of the subject may be an absolute elevation of any physical point of the subject. Alternatively, the absolute elevation of the subject may be a statistical value (e.g., a mean value) of the absolute elevations of all or a portion of the physical points of the subject. Merely by way of example, the subject may be an autonomous vehicle having a LiDAR device. The absolute elevation of the autonomous vehicle may be an absolute elevation of the LiDAR device or a mean absolute elevation of the physical points on the autonomous vehicle.
In 510, the processing device 122 (e.g., the obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain a plurality of sets of data points, each of which represents a cell in a surrounding environment of the subject.
As used herein, the surrounding environment of the subject may refer to the circumstances and one or more objects (including living and non-living objects) surrounding the subject. The surrounding environment may cover an area having any size and shape. In some embodiments, the area covered by the surrounding environment may be associated with the performance of the sensor assembled on the subject. Taking an autonomous vehicle moving on a lane as an example, a surrounding environment of the autonomous vehicle may include one or more objects around the autonomous vehicle, such as a ground surface, a lane marking, a  building, a pedestrian, an animal, a plant, one or more other vehicles, or the like. The size of an area covered by the surrounding environment of the autonomous vehicle may depend (or partially depend) on a scanning range of a LiDAR device assembled on the autonomous vehicle.
A cell of the surrounding environment may correspond to a subspace of the surrounding environment and include a plurality of physical points in the corresponding subspace, such as physical points on the body surface of an object located in the corresponding subspace. In some embodiments, the cell is a virtual concept. The set of data points representative of the cell may include a plurality of data points, each of which represents a physical point in the cell (i.e., in the corresponding subspace) and includes at least one feature value of at least one feature of the physical point. Exemplary features of a physical point may include a relative position of the physical point with respect to the sensor (or the subject), an intensity of the physical point, a classification of the physical point, a scan direction associated with the physical point, or the like, or any combination thereof.
In certain embodiments, the relative position of the physical point with respect to the sensor (or the subject) may be denoted as a coordinate of the physical point in a reference coordinate system associated with the sensor (or the subject). The reference coordinate system associated with the sensor (or the subject) may be any suitable coordinate system whose origin is located at the sensor (or the subject). Merely by way of example, the reference coordinate system may be a 3-dimensional coordinate system having an X-axis, a Y-axis, and a Z-axis. The Y-axis may be parallel with an orientation of the subject. The X-axis may be perpendicular to the Y-axis and form an X-Y plane that is parallel or roughly parallel with a ground surface or a bottom surface of the subject. The Z-axis may be perpendicular to the X-Y plane. The coordinate of the physical point in the reference coordinate system may include one or more of an X-coordinate on the X-axis, a Y-coordinate on the Y-axis, and a Z-coordinate on the Z-axis, wherein the Z-coordinate may also be referred to as the relative elevation of the physical point with respect to the sensor. The intensity of the physical point may refer to a strength of returned laser pulse(s) that are reflected by the physical point. The intensity of the physical point may be associated with a property (e.g., the composition and/or material) of the physical point. The classification of the physical point may refer to a type of an object that the physical point belongs to. The scan direction associated with the physical point may refer to the direction in which a scanning mirror of the sensor was directed when the corresponding data point was detected by the sensor.
In some embodiments, the at least one feature of a physical point recorded by a corresponding data point may be the relative elevation of the physical point with respect to the sensor. Alternatively, the at least one feature of the physical point may include the relative elevation and one or more secondary features of the physical point. As used herein, a secondary feature may refer to any feature other than the relative elevation. For example, the secondary feature may be the intensity, the X-coordinate in the reference coordinate system, the Y-coordinate in the reference coordinate system, the classification, the scan direction, or any other feature that can be measured by the sensor.
In some embodiments, the sets of data points representative of the cells may be acquired by a sensor (e.g., the sensor (s) 112) assembled on the subject. For example, the sensor may include one or more LiDAR devices as described in connection with FIG. 1. The sensor may be configured to emit laser pulses to scan the surrounding environment to collect data points representative of physical points in the surrounding environment (also referred to as point-cloud data representative of the surrounding environment herein) . The processing device 122 may obtain the point-cloud data from the sensor or another source that stores the point-cloud data. The processing device 122 may further divide the point-cloud data into the sets of data points corresponding to the cells. Details regarding the obtaining or determination of the sets of data points may be found elsewhere in the present disclosure (e.g., FIG. 6 and the relevant descriptions thereof) .
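Merely for illustration, the division of the point-cloud data into per-cell sets may be sketched as a grid binning on the X-Y plane. The 1 m cell size, the tuple layout of a data point, and the floor-division indexing are assumptions made for this sketch only; the disclosure does not fix a cell size or data layout.

```python
import collections

def divide_into_cells(points, cell_size=1.0):
    """Group point-cloud data points into grid cells on the X-Y plane.

    Each point is (x, y, z_relative, intensity) in the reference
    coordinate system; the cell index is the pair of floor-divided
    X and Y coordinates.  The 1.0 m cell size is an illustrative
    assumption, not a value from the disclosure.
    """
    cells = collections.defaultdict(list)
    for x, y, z, intensity in points:
        key = (int(x // cell_size), int(y // cell_size))
        cells[key].append((x, y, z, intensity))
    return dict(cells)

points = [(0.2, 0.3, 1.0, 10.0), (0.8, 0.1, 1.2, 12.0), (1.5, 0.4, 0.9, 8.0)]
cells = divide_into_cells(points)
# the first two points share the cell at grid index (0, 0)
```

In practice the binning would more likely run on a NumPy array or a voxel-grid structure; a dictionary keyed by grid index keeps the idea visible.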
In 520, for each cell, the processing device 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may determine one or more first characteristic values and one or more second characteristic values of the cell. The first characteristic value(s) of a cell may represent absolute elevations of the corresponding physical points. The second characteristic value(s) of a cell may represent a secondary feature of the corresponding physical points. The secondary feature may be or include the intensity, the X-coordinate and/or Y-coordinate in the reference coordinate system, the classification, the scan direction, or the like, or any combination thereof.
For illustration purposes, the intensity is described as an example of the secondary feature. The following descriptions are provided with reference to the determination of one or more first characteristic values representing the absolute elevation and one or more second characteristic values representing the intensity for a cell. It should be understood that this is only one exemplary embodiment. The secondary feature may include any feature (s) other than the intensity.
In some embodiments, the first characteristic value (s) representing the absolute elevation may include a third characteristic value indicating an overall absolute elevation of the cell (i.e., an overall level of the absolute elevations of the corresponding physical points) . Additionally or alternatively, the first characteristic value (s) of the cell may include a fourth characteristic value indicating an absolute elevation distribution of the cell (i.e., a distribution of the absolute elevations of the corresponding physical points) . In certain embodiments, the third characteristic value may be a mean absolute elevation, a median absolute elevation, or any other parameter that can reflect the overall absolute elevation of the cell. The fourth characteristic value may be a covariance, a variance, a standard deviation of the absolute elevations of the physical points in the cell, or any other parameter that can reflect the absolute elevation distribution of the cell.
Similarly, the second characteristic value (s) representing the intensity may include a fifth characteristic value indicating an overall intensity of the cell (i.e., an overall level of the intensities of the corresponding physical points) . Additionally or alternatively, the second characteristic value (s) representing the intensity may include a sixth characteristic value indicating an intensity distribution of the cell (i.e., a distribution of the intensities of the corresponding physical points) . In certain  embodiments, the fifth characteristic value may be a mean intensity, a median intensity, or any other parameter that can reflect the overall intensity of the cell. The sixth characteristic value may be a covariance, a variance, a standard deviation of the intensities of the physical points in the cell, or any other parameter that can reflect the intensity distribution of the cell.
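Merely for illustration, the third through sixth characteristic values of a cell may be computed as below, taking the mean and the variance as the illustrative statistics named above. The example works on the elevations as given; shifting relative elevations by the hypothetic elevation is a separate step.

```python
def cell_characteristics(cell_points):
    """Compute the third/fourth (elevation) and fifth/sixth (intensity)
    characteristic values of one cell, using mean and variance as the
    illustrative statistics.

    cell_points: list of (elevation, intensity) pairs for the
    physical points in the cell.
    """
    n = len(cell_points)
    zs = [p[0] for p in cell_points]
    its = [p[1] for p in cell_points]
    mean_z = sum(zs) / n                                  # third value
    var_z = sum((z - mean_z) ** 2 for z in zs) / n        # fourth value
    mean_i = sum(its) / n                                 # fifth value
    var_i = sum((i - mean_i) ** 2 for i in its) / n       # sixth value
    return mean_z, var_z, mean_i, var_i

vals = cell_characteristics([(1.0, 10.0), (3.0, 14.0)])
# → (2.0, 1.0, 12.0, 4.0)
```

A median or standard deviation, also named in the disclosure, would slot into the same structure.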
In some embodiments, the first characteristic value (s) of the cell may be determined based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points. As used herein, the hypothetic elevation of the subject may refer to a hypothetic value of the absolute elevation of the subject in the coordinate system. In certain embodiments, the processing device 122 may determine one or more characteristic values representing the relative elevations of the physical points in the cell. The processing device 122 may further determine the first characteristic value (s) of the cell based on the hypothetic elevation and the characteristic value (s) representing the relative elevations of the physical points in the cell.
In some embodiments, the second characteristic value (s) of the cell may be determined based on the feature values of the secondary feature of the physical points in the cell. In some embodiments, the processing device 122 may utilize one or more vectors or matrixes to represent the first and the second characteristic value (s) of the cell. For example, a feature vector may be constructed based on the third characteristic value and the fifth characteristic value to represent the overall absolute elevation and the overall intensity of the cell. As another example, a covariance matrix may be determined to represent the absolute elevation distribution and the intensity distribution of the cell. Details regarding the hypothetic elevation and/or the determination of the first and the second characteristic value (s) may be found elsewhere in the present disclosure (e.g., FIG. 7 and the relevant descriptions thereof) .
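A minimal sketch of this construction follows, assuming a two-feature (absolute elevation, intensity) vector and a 2×2 covariance matrix; plain Python is used for transparency, though an implementation would more likely use NumPy. Adding the hypothetic elevation shifts the elevation mean but leaves the covariance unchanged.

```python
def cell_vector_and_cov(rel_elevs, intensities, hypothetic_elev):
    """Build a feature vector (mean absolute elevation, mean intensity)
    and a 2x2 covariance matrix for one cell.

    rel_elevs: relative elevations of the cell's physical points.
    hypothetic_elev: hypothetic absolute elevation of the subject.
    """
    n = len(rel_elevs)
    abs_elevs = [z + hypothetic_elev for z in rel_elevs]
    mu_z = sum(abs_elevs) / n
    mu_i = sum(intensities) / n
    feature_vector = [mu_z, mu_i]
    # covariance matrix over (absolute elevation, intensity)
    c_zz = sum((z - mu_z) ** 2 for z in abs_elevs) / n
    c_ii = sum((i - mu_i) ** 2 for i in intensities) / n
    c_zi = sum((z - mu_z) * (i - mu_i)
               for z, i in zip(abs_elevs, intensities)) / n
    cov = [[c_zz, c_zi], [c_zi, c_ii]]
    return feature_vector, cov

vec, cov = cell_vector_and_cov([1.0, 3.0], [10.0, 14.0], hypothetic_elev=50.0)
# → vec == [52.0, 12.0], cov == [[1.0, 2.0], [2.0, 4.0]]
```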
In 530, for each cell, the processing device 122 (e.g., the obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain one or more first reference characteristic values and one or more second reference characteristic values of the cell based on a location information database.
The first reference characteristic value (s) of a cell may represent absolute elevations of a plurality of reference physical points in the cell. In some embodiments, the first reference characteristic value (s) of the cell may include a third reference characteristic value indicating an overall level of the absolute elevation of the reference physical points in the cell (also referred to as a reference overall absolute elevation of the cell) , such as a mean or median absolute elevation of the corresponding reference physical points. Additionally or alternatively, the first reference characteristic value (s) may include a fourth reference characteristic value indicating a distribution of the absolute elevations of the reference physical points in the cell (also referred to as a reference absolute elevation distribution of the cell) , such as a covariance, a variance, a standard deviation of the absolute elevations of the corresponding reference physical points.
The second reference characteristic value (s) may represent the secondary feature of the reference physical points in the cell. Taking the intensity as an exemplary secondary feature, the second reference characteristic value (s) of a cell may include a fifth reference characteristic value indicating an overall level of the intensities of the reference physical points in the cell (also referred to as a reference overall intensity of the cell) , such as a mean or median intensity of the corresponding reference physical points in the cell. Additionally or alternatively, the second reference characteristic value (s) of a cell may include a sixth reference characteristic value indicating a distribution of the intensities of the reference physical points in the cell (also referred to as a reference intensity distribution of the cell) , such as a covariance, a variance, a standard deviation of the intensities of the corresponding reference physical points.
As used herein, the location information database may refer to a database that stores location information of a region (e.g., a country or a city) covering the surrounding environment of the subject. The location information database may be a local database in the autonomous driving system 100, for example, a portion of the storage device 140. Alternatively, the location information database may be a remote database, such as a cloud database, which can be accessed by the processing device 122 via the network 150.
In some embodiments, the location information database may store the first and/or second reference characteristic values of the cells, which may be previously determined and stored in the location information database. The processing device 122 may directly obtain the first and/or second reference characteristic values of the cells from the location information database. For example, the location information database may include a plurality of first reference characteristic values of a plurality of reference cells in the region where the subject is located. A reference cell of the region may be similar to a cell in the surrounding environment. For each cell in the surrounding environment, the processing device 122 may determine a reference cell matching the cell (e.g., a reference cell having the same or similar position as the cell in the coordinate system), and designate the first reference characteristic value of the matched reference cell as the first reference characteristic value of the cell.
In some embodiments, the location information database may store reference point-cloud data representative of the region. The processing device 122 may determine the first and/or second reference characteristic values of the cells based on the reference point-cloud data representative of the region. In certain embodiments, the reference point-cloud data may be stored as an HD map of the region. In certain embodiments, the reference point-cloud data may include a plurality of reference data points representative of a plurality of reference physical points in the region. As used herein, a reference physical point may refer to a physical point in the region which is detected by a sensor other than the sensor of the subject. A reference data point of a reference physical point may encode one or more feature values of one or more features of the reference physical point. Exemplary features of the reference physical point may include but are not limited to a position of the reference physical point in the coordinate system (e.g., a standard coordinate system for the Earth) , a relative elevation of the reference physical point with respect to a sensor that detects the reference physical point, one or more  secondary features (e.g., an intensity, a classification, and/or a scan direction) of the reference physical point, or the like, or any combination thereof. The position of the reference physical point in the coordinate system may include, for example, a longitude, a latitude, and/or an absolute elevation of the reference physical point in the coordinate system.
In certain embodiments, at least a portion of the reference point-cloud data may be previously acquired by a sensor mounted on a sample subject. For example, a survey vehicle (e.g., a vehicle 110) may be dispatched for a survey trip to scan the region. As the survey vehicle moves in the region, one or more sensors with high accuracy (e.g., a LiDAR device) installed in the survey vehicle may detect the reference physical points and acquire information, such as but not limited to the relative position with respect to the sensor (s) of the survey vehicle, the intensity, and the classification of each reference physical point. Additionally or alternatively, at least a portion of the reference point-cloud data may be inputted by a user or be determined based on the information acquired by the survey vehicle. For example, the absolute elevations of the reference physical points may be determined based on the information acquired by the survey vehicle and stored in the location information database. Optionally, the absolute elevations of the reference physical points may be confirmed and/or modified by a user before the storing.
In some embodiments, for each cell in the surrounding environment of the subject, the processing device 122 may obtain a reference set of reference data points corresponding to the cell from the location information database. Each reference data point in the reference set may represent a reference physical point in the cell and include an absolute elevation of the reference physical point in the coordinate system. In certain embodiments, the processing device 122 may obtain the reference point-cloud data of the region where the subject is located from the location information database, and further determine the reference sets of the cells based on the reference point-cloud data. For example, the processing device 122 may determine a group of reference physical points in the cell based on the positions of the reference physical points in the coordinate system, wherein the reference data points representative of the group of reference physical points may be regarded as the reference set of the cell. In some other embodiments, the reference point-cloud data of the region in the location information database may be previously divided into a plurality of reference sets corresponding to the reference cells in the region, for example, in a manner similar to the division of the point-cloud data of the surrounding environment as described in connection with operation 620. For each cell in the surrounding environment, the processing device 122 may determine a reference cell matching the cell (e.g., a reference cell having the same or similar position as the cell in the coordinate system), and designate the reference set of the matched reference cell as the reference set of the cell.
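Merely for illustration, the grouping of reference data points into per-cell reference sets, and the lookup of the set matching a given cell, may be sketched as below. The grid indexing and the (x, y, absolute elevation, intensity) tuple layout are assumptions of this sketch, chosen to mirror the binning of the subject's own point cloud.

```python
def reference_sets_by_cell(reference_points, cell_size=1.0):
    """Index reference data points (x, y, z_absolute, intensity) by
    grid cell so each surrounding-environment cell can be matched to
    the reference set at the same position.  Floor-division grid
    indexing is an illustrative assumption.
    """
    sets = {}
    for x, y, z_abs, intensity in reference_points:
        key = (int(x // cell_size), int(y // cell_size))
        sets.setdefault(key, []).append((x, y, z_abs, intensity))
    return sets

ref_sets = reference_sets_by_cell([(0.5, 0.5, 50.3, 9.0),
                                   (1.2, 0.7, 50.1, 7.0)])
matched = ref_sets.get((0, 0))  # reference set for the cell at index (0, 0)
```

With such an index, "determining a reference cell matching the cell" reduces to a dictionary lookup by grid index.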
After the reference set of each cell is obtained or determined, the processing device 122 may determine the first reference characteristic value(s) of each cell based on the absolute elevations of the reference physical points in the cell. The processing device 122 may further determine the second reference characteristic value(s) of the cell based on feature values of the secondary feature of the reference physical points in the cell. In some embodiments, the first and the second reference characteristic value(s) may be denoted in a similar form as the first and the second characteristic value(s) as described in connection with operation 520, for example, as one or more vectors or matrixes.
In 540, the processing device 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system.
In some embodiments, the updating of the hypothetic elevation may include comparing the physical points with the reference physical points in each cell based on the first characteristic value (s) , the second characteristic value (s) , the first reference characteristic value (s) , and the second reference characteristic value (s) of each cell. For example, for each cell, the processing device 122 may determine a similarity degree between the corresponding physical points and the corresponding reference physical points based on the first, second, first reference, and second  reference characteristic values of each cell. The processing device 122 may further update the hypothetic elevation based on the similarity degrees of the cells. In certain embodiments, the processing device 122 may determine the absolute elevation by performing one or more iterations as described in connection with FIG. 8.
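The disclosure leaves the similarity measure and the search strategy open. One hypothetical realization scores each candidate hypothetic elevation with a Gaussian-style similarity per cell and keeps the best-scoring candidate; both the scoring function and the grid search below are assumptions of this sketch, not the claimed method.

```python
import math

def similarity(cell_mean_rel, ref_mean_abs, hypothetic_elev, ref_var,
               eps=1e-6):
    """Gaussian-style similarity between a cell and its reference cell:
    close to 1 when the shifted cell mean matches the reference mean,
    near 0 otherwise.  An illustrative choice of similarity degree."""
    diff = (cell_mean_rel + hypothetic_elev) - ref_mean_abs
    return math.exp(-diff * diff / (2.0 * (ref_var + eps)))

def update_hypothetic_elevation(cells, candidates):
    """Pick the candidate elevation maximizing the summed similarity
    over all cells.  cells: list of (mean_rel_elev, ref_mean_abs,
    ref_var) triples, one per cell."""
    return max(candidates,
               key=lambda h: sum(similarity(m, r, h, v)
                                 for m, r, v in cells))

cells = [(1.0, 51.0, 0.1), (2.0, 52.0, 0.1)]
elev = update_hypothetic_elevation(cells, [49.0, 49.5, 50.0, 50.5, 51.0])
# both cells agree on a 50.0 m offset, so the search settles on 50.0
```

An iterative scheme, as described in connection with FIG. 8, would repeat such an update with a refined candidate range until convergence.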
It should be noted that the above descriptions regarding the process 500 are merely provided for the purpose of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. In some embodiments, the order in which the operations of the process 500 are described above is not intended to be limiting. For example, operations 520 and 530 may be performed simultaneously, or operation 530 may be performed before operation 520.
In some embodiments, in operation 520, the processing device 122 may only determine the first characteristic value (s) of each cell to represent the absolute elevations of the physical points in each cell. In operation 530, the processing device 122 may only obtain the first reference characteristic value (s) of each cell to represent the absolute elevations of the reference physical points in each cell. In operation 540, for each cell, the processing device 122 may compare the physical points with the reference physical points in the cell based on the first characteristic value (s) and the first reference characteristic value (s) of the cell. In some embodiments, the data point of each physical point may include feature values of a plurality of secondary features of the physical point. For each secondary feature, the processing device 122 may determine one or more second characteristic values and one or more second reference characteristic values of each cell to represent the secondary feature of the cell. In operation 540, for each cell, the processing device  122 may compare the physical points with the reference physical points in the cell based on the first characteristic value (s) , the first reference characteristic value (s) , the second characteristic values, and the second reference characteristic values of the cell.
In some embodiments, the subject (e.g., the vehicle 110) may move along a road. The process 500 may be performed continuously or intermittently (e.g., periodically or irregularly) to update the absolute elevation of the subject continuously or intermittently. In some embodiments, the absolute elevation of the subject determined in the process 500 may be an absolute elevation of a specific physical point (e.g., a LiDAR device) of the subject. The processing device 122 may further determine an absolute elevation of another physical point of the subject based on the absolute elevation of the specific physical point and a relative position between the specific physical point and the other physical point.
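This propagation from one physical point to another is a simple vertical offset, sketched here under the assumption that the relative position is expressed as a signed vertical distance:

```python
def elevation_of_point(specific_abs_elev, dz_to_other_point):
    """Absolute elevation of another physical point of the subject,
    given the specific point's (e.g., the LiDAR device's) absolute
    elevation and the signed vertical offset from the specific point
    to the other point (negative when the other point is lower)."""
    return specific_abs_elev + dz_to_other_point

# e.g., a point 1.5 m below the LiDAR device at 52.0 m
other_elev = elevation_of_point(specific_abs_elev=52.0,
                                dz_to_other_point=-1.5)
# → 50.5
```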
FIG. 6 is a flowchart illustrating an exemplary process for obtaining a plurality of sets of data points corresponding to a plurality of cells in a surrounding environment of a subject according to some embodiments of the present disclosure. The process 600 may be executed by the autonomous driving system 100. For example, the process 600 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240. The processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may be configured to perform the process 600. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, one or more operations of the process 600 may be performed to achieve at least part of operation 510 as described in connection with FIG. 5.
In 610, the processing engine 122 (e.g., the obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain data points representative of the surrounding environment of the subject. The obtained data points may also be referred to as point-cloud data representative of the surrounding environment herein.  Each data point of the point-cloud data may represent a physical point in the surrounding environment, such as a physical point on the body surface of an object in the surrounding environment. Each data point may include at least one feature value of at least one feature of the corresponding physical point as described in connection with operation 510.
In some embodiments, the point-cloud data may be acquired by a sensor (e.g., the sensor (s) 112) assembled on the subject, such as one or more LiDAR devices as described elsewhere in the present disclosure (e.g., FIG. 1, and descriptions thereof) . For example, the sensor may emit laser pulses to the surrounding environment. The laser pulses may be reflected by the physical points in the surrounding environment and return to the sensor. The sensor may generate the point-cloud data representative of the surrounding environment based on one or more characteristics of the returned laser pulses. In certain embodiments, the point-cloud data may be collected during a time period (e.g., 1 second, 2 seconds) when the subject (e.g., the vehicle 110) stops on or travels along a road. In the collection of the point-cloud data, the sensor may rotate in a scanning angle range (e.g., 360 degrees, 180 degrees, 120 degrees) and scan the surrounding environment at a certain scan frequency (e.g., 10 Hz, 15 Hz, 20 Hz) .
In some embodiments, the processing device 122 may obtain the point-cloud data from the sensor or from a storage device (e.g., the storage device 140) in the autonomous driving system 100 that stores the point-cloud data. Additionally or alternatively, the processing device 122 may obtain the point-cloud data from an external source that stores the point-cloud data via the network 150.
In 620, the processing engine 122 (e.g., the obtaining module 410) (e.g., the processing circuits of the processor 220) may divide the point-cloud data into a plurality of sets of data points corresponding to a plurality of cells of the surrounding environment.
For example, for each physical point of the surrounding environment, the processing device 122 may determine a cell to which the physical point belongs. The processing device 122 may then classify the data point representative of the physical point into the set of data points corresponding to the specific cell. For illustration purposes, the determination of a cell to which a physical point A belongs is described as an example. In some embodiments, the processing device 122 may obtain or determine a reference plane (e.g., an X-Y plane defined by the standard coordinate system of Earth, a plane being parallel or roughly parallel to the ground surface on which the autonomous vehicle moves, a plane defined by a local map associated with the subject) for the surrounding environment. The reference plane may be evenly or unevenly divided into a plurality of 2D cells (each of which may have a known coordinate in the coordinate system according to certain embodiments of the present disclosure) . The 2D cells may have any suitable shapes and/or sizes. For example, the reference plane may be evenly divided into the 2D cells, each of which has the shape of a regular triangle, a rectangle, a square, a regular hexagon, a circle, or the like. Particularly, in certain embodiments, each 2D cell may have the shape of a square, for example, a square whose side length is 10 centimeters, 20 centimeters, 30 centimeters, or the like. The processing device 122 may further project the physical point A onto a corresponding 2D cell of the reference plane, wherein the corresponding 2D cell may be regarded as a cell of the surrounding environment to which the physical point A belongs. In this situation, the cell may correspond to a sub-space of the surrounding environment that includes a plurality of physical points whose projection points are within the 2D cell. The physical points in the cell may refer to the physical points whose projection points are within the 2D cell.
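The cell-assignment step above can be sketched as follows. This is a minimal illustration, assuming square 2D cells on an X-Y reference plane, numpy, and an [x, y, z, intensity] point layout; the function name and the 20-centimeter default are illustrative, not from the disclosure:

```python
import numpy as np

def divide_into_cells(points, cell_size=0.2):
    """Assign each data point to a square 2D cell by projecting it onto the
    X-Y reference plane. `points` is an (n, 4) array of [x, y, z, intensity]
    rows; `cell_size` is the side length of a square cell in meters.
    Returns a dict mapping (i, j) cell indices to arrays of member points."""
    cells = {}
    # Cell index of each point's projection onto the reference plane.
    indices = np.floor(points[:, :2] / cell_size).astype(int)
    for key, point in zip(map(tuple, indices), points):
        cells.setdefault(key, []).append(point)
    return {k: np.asarray(v) for k, v in cells.items()}
```

Each returned set of data points can then be processed independently, as described below.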
In some embodiments, the processing device 122 may determine a coordinate of the projection point of the physical point A on the reference plane, and then determine the corresponding 2D cell of the physical point A based on the coordinate of the projection point. The coordinate of the projection point may be determined based on a positioning technique, for example, a template matching technique, a normalized cross-correlation (NCC) technique, a single shot detector (SSD) technique, a phase correlation technique, or the like, or any combination thereof. According to certain embodiments of the present disclosure, the coordinate of the projection point may be determined based on the point-cloud data  representative of the surrounding environment and a reference map (e.g., a pre-built HD map including the surrounding environment) , for example, by registering the point-cloud data and data included in the reference map. Exemplary techniques for determining the coordinate of the projection point may be found in, for example, International Applications PCT/CN2019/085637 filed on May 6, 2019, and PCT/CN2019/095816 filed on July 12, 2019, both entitled “SYSTEMS AND METHODS FOR POSITIONING” , the contents of each of which are hereby incorporated by reference.
According to some embodiments of the present disclosure, the sets of data points corresponding to the cells may further be processed in parallel, thereby improving the computational efficiency and reducing the processing time. For example, the set of data point of each cell may be analyzed to determine one or more characteristic values of the cell. The one or more characteristic values of each cell may further be compared with one or more reference characteristic values of the cell (which may be obtained from a location information database or be determined based on previously collected data representative of the cell) . The absolute elevation of the subject may be determined according to the comparison result corresponding to each cell.
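The per-cell parallelism described above might be sketched as follows; the executor choice, the [x, y, z, intensity] column layout, and the particular characteristic values (mean and variance of elevation) are assumptions for illustration:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def cell_characteristics(points):
    """Illustrative per-cell analysis: mean elevation and elevation
    variance, assuming rows of [x, y, z, intensity]."""
    z = points[:, 2]
    return float(z.mean()), float(z.var(ddof=1))

def analyze_cells_in_parallel(cells):
    """Each cell's set of data points is independent of the others, so the
    characteristic values of the cells can be computed in parallel."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(cell_characteristics, cells.values()))
    return dict(zip(cells.keys(), results))
```

A process pool (or a GPU kernel) could be substituted for heavier per-cell workloads; the structure is the same.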
It should be noted that the above description regarding the process 600 is merely provided for the purpose of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 600 are described above is not intended to be limiting.
FIG. 7 is a flowchart illustrating an exemplary process for determining one or more first characteristic values of a cell according to some embodiments of the present disclosure. The process 700 may be executed by the autonomous driving  system 100. For example, the process 700 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240. The processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 122 may be configured to perform the process 700. The operations of the illustrated process presented below are intended to be illustrative.
In some embodiments, one or more operations of the process 700 may be performed to achieve at least part of operation 520 as described in connection with FIG. 5. For example, the process 700 may be performed for each cell in the surrounding environment of the subject to determine the first characteristic value (s) of the cell. As described in connection with operation 520, the first characteristic value (s) of a cell may include a third characteristic value indicating an overall absolute elevation of the cell and/or a fourth characteristic value indicating an absolute elevation distribution of the cell. For illustration purposes, the following descriptions are described with reference to the determination of the third and fourth characteristic values. It should be understood that this is only one exemplary embodiment. The first characteristic value may only include one of the third and fourth characteristic values.
In 710, the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may determine a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the physical points in the cell. The preliminary third characteristic value may indicate an overall relative elevation of the cell (i.e., an overall level of the relative elevations of the physical points in the cell) . The preliminary fourth characteristic value may indicate a relative elevation distribution of the cell (i.e., a distribution of the relative elevations of the physical points in the cell) .
In some embodiments, the preliminary third characteristic value may be a mean relative elevation, a median relative elevation, or any other parameter that can reflect the overall relative elevation of the cell. The preliminary fourth characteristic value may be a covariance, a variance, a standard deviation of the relative elevations of the physical points in the cell, or any other parameter that can reflect the relative elevation distribution of the cell.
In 720, the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may determine the third characteristic value based on the preliminary third characteristic value and a hypothetic elevation of the subject in the coordinate system.
As used herein, the hypothetic elevation may refer to a hypothetic value of the absolute elevation of the subject in the coordinate system. In some embodiments, the hypothetic elevation may be a default setting of the autonomous driving system 100, for example, 0 meters, 10 meters, 100 meters, or the like. Alternatively, the hypothetic elevation may be inputted by a user of the autonomous driving system 100. Alternatively, the hypothetic elevation may be determined by the processing device 122.
In some embodiments, the processing device 122 may obtain an estimated location of the subject. The estimated location may refer to a geographic location where the subject is located. In certain embodiments, the geographic location may be denoted by a geographic coordinate (e.g., longitudinal and latitudinal coordinates) of the location. The estimated location may be received from a GPS device mounted on the subject and/or the navigation system 160. The processing device 122 may then determine the hypothetic elevation based on the estimated location. For example, the processing device 122 may obtain an average absolute elevation of a city or a region where the subject is located from a location information database (e.g., an HD map) , and designate the average absolute elevation as the hypothetic elevation.
As another example, the processing device 122 may obtain or determine one or more reference absolute elevations of one or more reference objects based on the  estimated location of the subject and the location information database. The reference object (s) may include any object that is located near the estimated location (e.g., within a threshold distance from the estimated location) . Exemplary reference objects may include a ground surface, a lane marking, a building, a pedestrian, an animal, or the like. The reference absolute elevation (s) of the reference object (s) may be obtained directly from the location information database. Alternatively, for a reference object, the processing device 122 may obtain absolute elevations of a plurality of physical points of the reference object from the location information database, and determine a mean or median absolute elevation of the physical points of the reference object as the reference absolute elevation of the reference object. After the reference absolute elevation (s) of reference object (s) are obtained and/or determined, the processing device 122 may determine the hypothetic elevation based on the reference absolute elevation (s) . For example, a mean value or a median value of the reference absolute elevation (s) of the reference object (s) may be determined as the hypothetic elevation.
In some embodiments, the third characteristic value representing the overall absolute elevation of the cell may be equal to a sum of the preliminary third characteristic value representing the overall relative elevation and the hypothetic elevation. Alternatively, the third characteristic value may be equal to a sum of the preliminary third characteristic value, the hypothetic elevation, and an installation height of the sensor. The installation height of the sensor may refer to a relative height of the sensor to a bottom of the subject (e.g., a chassis of a vehicle 110) .
In 730, the processing engine 122 (e.g., the characteristic determination module 420) (e.g., the processing circuits of the processor 220) may designate the preliminary fourth characteristic value representing the relative elevation distribution of the cell as the fourth characteristic value representing the absolute elevation distribution of the cell.
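Operations 710 through 730 can be summarized in a short sketch. The [x, y, z, intensity] column layout, the use of the mean and the (unbiased) variance as the preliminary values, and the optional sensor installation height term are assumptions drawn from the examples above:

```python
import numpy as np

def first_characteristic_values(cell_points, hypothetic_elevation,
                                sensor_height=0.0):
    """Sketch of operations 710-730 for one cell. Rows of `cell_points` are
    assumed to be [x, y, z, intensity], with z the relative elevation."""
    z = cell_points[:, 2]
    preliminary_third = float(z.mean())        # overall relative elevation (710)
    preliminary_fourth = float(z.var(ddof=1))  # relative elevation spread (710)
    # 720: shift the overall value by the hypothetic elevation (optionally
    # plus the sensor installation height).
    third = preliminary_third + hypothetic_elevation + sensor_height
    # 730: a constant shift leaves the distribution unchanged.
    fourth = preliminary_fourth
    return third, fourth
```

Note that only the overall value is shifted; the distribution value is passed through unchanged, which is why operation 730 is a simple designation.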
It should be noted that the above description of the process 700 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 700 are described above is not intended to be limiting.
In some embodiments, operation 710 may be divided into a first sub-operation to determine the preliminary third characteristic value and a second sub-operation to determine the preliminary fourth characteristic value. Additionally or alternatively,  operations  720 and 730 may be performed simultaneously or operation 730 may be performed before operation 720.
In some embodiments, the processing device 122 may determine only one of the third and fourth characteristic values as the first characteristic value (s) . In some embodiments, the preliminary characteristic value (s) representing the relative elevation may be determined and/or denoted together with one or more characteristic values representing one or more other features. Merely by way of example, the third preliminary characteristic value representing the overall relative elevation may be determined and/or denoted together with the fifth characteristic value representing the overall intensity. For example, a vector P x indicating both an overall relative elevation and an overall intensity of an x th cell may be determined according to Equation (1) as below:
$P_x = \frac{1}{n} \sum_{k=1}^{n} P_k$ ,  (1)

where n refers to the total number (or count) of physical points in the x th cell, k refers to the k th physical point in the x th cell, and $P_k$ refers to $[Z_k, I_k]$ , wherein $Z_k$ and $I_k$ refer to the relative elevation and the intensity of the k th physical point, respectively. In some embodiments, the vector $P_x$ may be denoted as $[Z_x, I_x]$ , wherein $Z_x$ refers to the mean relative elevation (which is an exemplary preliminary third characteristic value) of the x th cell, and $I_x$ refers to the mean intensity (which is an exemplary fifth characteristic value) of the x th cell. In certain embodiments, a vector $P'_x$ representing an overall absolute elevation and the overall intensity of the x th cell may then be determined based on the vector $P_x$. For example, $P'_x$ may be denoted as $[Z_x+Z, I_x]$ , wherein Z refers to the hypothetic elevation of the subject.

Additionally or alternatively, the processing device 122 may determine a covariance matrix $C_x$ to indicate both the relative elevation distribution and the intensity distribution of the x th cell. The covariance matrix $C_x$ may be designated as a covariance matrix that indicates both the absolute elevation distribution and the intensity distribution of the x th cell. In certain embodiments, the covariance matrix $C_x$ may be determined according to Equation (2) as below:

$C_x = \frac{1}{n-1} \sum_{k=1}^{n} (P_k - P_x) (P_k - P_x)^T$ .  (2)
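A compact numpy rendering of Equations (1) and (2) might look like the following; the (n, 2) row layout of [relative elevation, intensity] pairs and the unbiased 1/(n-1) covariance normalization are assumptions:

```python
import numpy as np

def cell_vector_and_covariance(cell_points_zi):
    """Equations (1)-(2) sketch: `cell_points_zi` is an (n, 2) array whose
    rows are [Z_k, I_k] (relative elevation and intensity of the k-th
    physical point). Returns the mean vector P_x and covariance matrix C_x
    of the cell."""
    P_x = cell_points_zi.mean(axis=0)            # Equation (1): mean of P_k
    C_x = np.cov(cell_points_zi, rowvar=False)   # Equation (2): 1/(n-1) scaling
    return P_x, C_x
```

`np.cov` with `rowvar=False` treats each row as an observation and uses the 1/(n-1) normalization by default, matching the unbiased sample covariance.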
FIG. 8 is a schematic diagram illustrating an exemplary process for determining an absolute elevation of a subject according to some embodiments of the present disclosure. The process 800 may be executed by the autonomous driving system 100. For example, the process 800 may be implemented as a set of instructions stored in a storage device of the autonomous driving system 100, such as but not limited to the storage device 140, the ROM 230, and/or RAM 240. The processing device 122 (e.g., the processor 220 of the computing device 200, the CPU 340 of the mobile device 300, and/or the modules in FIG. 4) may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 800.
In some embodiments, one or more operations of the process 800 may be performed to achieve at least part of operation 540 as described in connection with FIG. 5. In certain embodiments, the process 800 may include one or more iterations. The hypothetic elevation of the subject and/or the first characteristic value (s) of each cell described in connection with operation 510 may be updated. For illustration purposes, a current iteration of the process 800 is described. The current iteration may include one or more of the operations as shown in FIG. 8.
In 810, for each cell, the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may determine a similarity degree between the physical points and the corresponding reference physical points in the cell in the current iteration. The similarity degree corresponding to a cell may be configured to measure a difference or similarity between the physical points detected by the sensor assembled on the subject and the corresponding reference physical points stored in the location information database (e.g., an HD map) .
As described in connection with FIG. 5, in some embodiments, the processing device 122 may merely determine the first characteristic value (s) and the reference first characteristic value (s) of each cell to represent the absolute elevation of the cell. In this situation, the similarity degree corresponding to a cell in the current iteration may be determined based on one or more first characteristic values in the current iteration and the first reference characteristic value (s) of the cell. In some other embodiments, the processing device 122 may further determine the second characteristic value (s) and the second reference characteristic value (s) of the cell to represent one or more secondary features of each cell. In this situation, the similarity degree corresponding to a cell in the current iteration may be determined based on one or more first characteristic value (s) in the current iteration, the first reference characteristic value (s) , as well as the second characteristic value (s) , and the second reference characteristic value (s) .
For illustration purposes, the determination of the similarity degree corresponding to a cell in the current iteration based on the absolute elevation and the intensity (which is an exemplary secondary feature) is described as an example. In some embodiments, the first characteristic value (s) in the current iteration and the second characteristic value (s) may be used to construct a feature vector representing the cell in the current iteration. The first reference characteristic value (s) and the second reference characteristic value (s) may be used to construct a reference feature vector representing the cell. The similarity degree of the cell in the current iteration may be represented by, for example, a cosine similarity, a Euclidean distance, or any other parameter that can measure the similarity between the feature vector of the cell in the current iteration and the corresponding reference feature vector.
In some embodiments, the processing device 122 may determine a cost function for a cell to measure the difference between the physical points and the reference physical points in the cell. The processing device 122 may further determine the similarity degree of the cell based on the cost function. For example, a cost function $e_x^t$ and a similarity degree $s_x^t$ corresponding to an x th cell in the t th iteration may be determined according to Equations (3) and (4) as below:

$e_x^t = (P_x^t - P_{xr})^T (C_x + C_{xr})^{-1} (P_x^t - P_{xr})$ ,  (3)

$s_x^t = \exp (-e_x^t / 2)$ ,  (4)

where $P_x^t$ refers to a vector representing an overall absolute elevation and an overall intensity of the x th cell in the t th iteration, $P_{xr}$ refers to a vector representing a reference overall absolute elevation and a reference overall intensity of the x th cell, $C_x$ refers to a covariance matrix representing an absolute elevation distribution and an intensity distribution of the x th cell, and $C_{xr}$ refers to a covariance matrix representing a reference absolute elevation distribution and a reference intensity distribution of the x th cell. In certain embodiments, the vector $P_x^t$ may be denoted as $[Z_x + Z^t, I_x]$ , wherein $Z_x$ refers to a preliminary third characteristic value representing the overall relative elevation of the x th cell, $I_x$ refers to a fifth characteristic value representing the overall intensity of the x th cell, and $Z^t$ refers to a hypothetic elevation of the subject in the current iteration. Details regarding the $C_x$, $Z_x$, and $I_x$ may be found elsewhere in the present disclosure (e.g., Equation (1) in FIG. 7 and the relevant descriptions thereof) .
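One plausible reading of the cost/similarity pair of Equations (3) and (4) is a Mahalanobis-style cost between the observed and reference cell vectors mapped through a negative exponential; the exact functional forms here are assumptions for illustration:

```python
import numpy as np

def similarity_degree(P_xt, P_xr, C_x, C_xr):
    """Cost e = (P_xt - P_xr)^T (C_x + C_xr)^-1 (P_xt - P_xr), mapped to a
    similarity in (0, 1] via exp(-e / 2). Inputs: 2-vectors of [overall
    elevation, overall intensity] and their 2x2 covariance matrices."""
    d = np.asarray(P_xt, dtype=float) - np.asarray(P_xr, dtype=float)
    e = float(d @ np.linalg.inv(C_x + C_xr) @ d)  # cost function
    return float(np.exp(-e / 2.0))                # similarity degree
```

Identical cell statistics give a similarity of 1, and the similarity decays toward 0 as the observed cell diverges from the reference cell relative to the combined covariance.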
In 820, the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may update the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration.
In some embodiments, the processing engine 122 may update the hypothetic elevation in the current iteration using a particle filter technique. The particle filter technique may utilize a set of particles (also referred to as samples) , each of which represents a hypothetic state of a system and has a weight assigned to it. The weight of a particle may represent a probability that the particle is an accurate representation of an actual state of the system. The particles may be updated (e.g., resampled) iteratively according to an observation of the system until a certain condition is met. The actual state of the system may then be determined based on the updated particles after the condition is met.
In operation, before the process 800 is performed, the processing device 122 may assign a plurality of particles to the cells in the surrounding environment. Each particle may have an initial state and be assigned to one or more cells of the plurality of cells. The initial state of a particle may represent a possible value of the absolute elevation of the subject. The initial states assigned to the particles may be the same or different. In certain embodiments, the particles may be assigned with the same initial state and each particle may represent the hypothetic elevation of the subject as described in connection with operation 510 (i.e., the initial hypothetic elevation before the iteration (s) of the process 800) . Alternatively, the particles may be assigned with different initial states representing different possible values of the absolute elevation. For example, the initial states of different particles may respectively represent an absolute elevation of a lane marking in the surrounding environment, an absolute elevation of a road sign in the surrounding environment, or the like. In certain embodiments, the particles may be distributed to the cells uniformly or nonuniformly. For example, assuming that there are X cells in the surrounding environment, each cell may be assigned with a particle and each particle may be assigned to N of the X cells.
The processing device 122 may further determine a weight of each particle in the current iteration based on the similarity degrees of the cells in the current iteration. For example, the processing engine 122 may determine a weight $w_j^t$ of a j th particle in the t th iteration according to Equation (5) as below:

$w_j^t = \left( \sum_{n=1}^{N} s_n^t \right) / \left( \sum_{x=1}^{X} s_x^t \right)$ ,  (5)

where N refers to the total number (or count) of cells that the j th particle is assigned to, $s_n^t$ refers to a similarity degree corresponding to an n th cell that the j th particle is assigned to, X refers to the total number (or count) of the cells in the surrounding environment, and $s_x^t$ refers to a similarity degree corresponding to an x th cell in the surrounding environment.
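Under this reading of Equation (5) — a particle's weight as the summed similarity of its assigned cells normalized by the total over all cells, which is an assumed normalization — the computation is direct:

```python
def particle_weight(assigned_similarities, all_similarities):
    """Weight of one particle: sum of the similarity degrees of the N cells
    the particle is assigned to, divided by the sum over all X cells in the
    surrounding environment."""
    return sum(assigned_similarities) / sum(all_similarities)
```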
After the weights of the particles are determined, the processing engine 122 may update the hypothetic elevation in the current iteration based on the weights and states of the particles in the current iteration. For example, the updated hypothetic elevation may be determined according to Equation (6) as below:

$\hat{Z}^t = \sum_{j=1}^{M} w_j^t \cdot z_j^t$ ,  (6)

where $\hat{Z}^t$ refers to the updated hypothetic elevation, M refers to the total number (or count) of the particles, and $z_j^t$ refers to the state of the j th particle in the t th iteration.
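Taking Equation (6) as a weight-averaged combination of the particle states (an assumed form consistent with the description), the update is a one-liner:

```python
def update_hypothetic_elevation(weights, states):
    """Updated hypothetic elevation: the particle states (candidate absolute
    elevations) combined by their weights, which are assumed to sum to 1."""
    return sum(w * z for w, z in zip(weights, states))
```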
In 830, the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may determine whether a termination condition is satisfied in the current iteration. An exemplary termination condition may be that the difference between the hypothetic elevation and the updated hypothetic elevation in the current iteration is within a threshold, indicating that the hypothetic elevation converges. Other exemplary termination conditions may include that a certain count of iterations has been performed, that a difference between the particles in the current iteration and the particles in the previous iteration is within a threshold such that the particles of the current iteration converge, that an overall similarity degree (e.g., a mean similarity degree) corresponding to the cells exceeds a threshold, etc.
In response to a determination that the termination condition is satisfied, the process 800 may proceed to 870. In 870, the processing engine 122 (e.g., the  absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may designate the updated hypothetic elevation in the current iteration as the absolute elevation of the subject.
On the other hand, in response to a determination that the termination condition is not satisfied, the process 800 may proceed to operations 840 to 860.
In 840, for each cell, the processing engine 122 (e.g., the absolute elevation determination module 430) may update the first characteristic value (s) of the cell in the current iteration based on the updated hypothetic elevation. As described elsewhere in this disclosure (e.g., operation 520 and the relevant descriptions) , the first characteristic value (s) of the cell may include a third characteristic value representing an overall absolute elevation of the cell and/or a fourth characteristic value representing an absolute elevation distribution of the cell. In certain embodiments, only the third characteristic value of the cell may need to be updated, and the fourth characteristic value may be omitted from the updating. The updated third characteristic value may be equal to a sum of the preliminary third characteristic value representing the overall relative elevation of the cell (e.g., the mean relative elevation of the cell) and the updated hypothetic elevation in the current iteration.
In 850, the processing engine 122 (e.g., the absolute elevation determination module 430) (e.g., the processing circuits of the processor 220) may designate the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration. In 860, for each cell, the processing engine 122 (e.g., the absolute elevation determination module 430) may designate the updated first characteristic value (s) of the cell in the current iteration as the first characteristic value (s) of the cell in the next iteration. After operations 840 to 860, the process 800 may proceed to operation 810 again to perform the next iteration until the termination condition is satisfied.
In some embodiments, before the next iteration is performed, the processing device 122 may update the particles in the current iteration based on the weights of the particles determined in the current iteration. The processing device 122 may further designate the updated particles as the particles in the next iteration. In certain embodiments, the processing device 122 may update the particles by resampling. For example, the processing device 122 may remove one or more particles if their weight (s) determined in the current iteration are smaller than a first threshold. As another example, the processing device 122 may replicate one or more particles if their weight (s) determined in the current iteration are greater than a second threshold. In certain embodiments, the processing device 122 may update the particles by updating the state (s) of one or more particles. Merely by way of example, the processing device 122 may determine an updated possible value of the absolute elevation of the subject as an updated state of a certain particle.
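The threshold-based resampling described above might be sketched as follows; the threshold values are illustrative assumptions:

```python
def resample_particles(particles, weights, low=0.01, high=0.5):
    """Remove particles whose weight is below `low`, replicate particles
    whose weight is above `high`, and keep the rest unchanged."""
    resampled = []
    for particle, weight in zip(particles, weights):
        if weight < low:
            continue                    # drop low-weight particles
        resampled.append(particle)
        if weight > high:
            resampled.append(particle)  # duplicate high-weight particles
    return resampled
```

In a full particle filter, systematic or multinomial resampling is also commonly used; the simple threshold scheme above mirrors the removal/replication examples in the text.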
It should be noted that the above descriptions regarding the process 800 are merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the process 800 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. For example, the process 800 may further include an operation to store the absolute elevation and/or an operation to transmit the absolute elevation to a terminal device associated with the subject (e.g., a built-in computer of the vehicle 110) for presentation. Additionally, the order in which the operations of the process 800 are described above is not intended to be limiting. Operations 840 to 860 may be performed in any order.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to and are intended for those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment,” “an embodiment,” and/or “some embodiments” mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of software and hardware, any of which may generally be referred to herein as a “unit,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in a combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, or the like; conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user’s computer, partly on the user’s computer as a stand-alone software package, partly on the user’s computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user’s computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider), in a cloud computing environment, or offered as a service such as a Software as a Service (SaaS).
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, for example, an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, inventive embodiments lie in less than all features of a single foregoing disclosed embodiment.
In some embodiments, the numbers expressing quantities or properties used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about,” “approximate,” or “substantially.” For example, “about,” “approximate,” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated. Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.
Each of the patents, patent applications, publications of patent applications, and other material, such as articles, books, specifications, publications, documents, things, and/or the like, referenced herein is hereby incorporated herein by this reference in its entirety for all purposes, excepting any prosecution file history associated with same, any of same that is inconsistent with or in conflict with the present document, or any of same that may have a limiting effect as to the broadest scope of the claims now or later associated with the present document. By way of example, should there be any inconsistency or conflict between the description, definition, and/or the use of a term associated with any of the incorporated material and that associated with the present document, the description, definition, and/or the use of the term in the present document shall prevail.
In closing, it is to be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the application. Other modifications that may be employed may be within the scope of the application. Thus, by way of example, but not of limitation, alternative configurations of the embodiments of the application may be utilized in accordance with the teachings herein. Accordingly, embodiments of the present application are not limited to that precisely as shown and described.

Claims (30)

  1. A system for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment, the system comprising:
    at least one storage medium including a set of instructions; and
    at least one processor in communication with the at least one storage medium, wherein when executing the instructions, the at least one processor is configured to direct the system to perform operations including:
    for each cell of a plurality of cells in the surrounding environment,
    obtaining a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point, wherein the at least one feature value is acquired by a sensor assembled on the subject, and the at least one feature of the physical point includes a relative elevation of the physical point with respect to the sensor;
    determining at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor, wherein the at least one first characteristic value represents absolute elevations of the corresponding physical points in the coordinate system; and
    obtaining, based on a location information database, at least one first reference characteristic value of the cell, wherein the at least one first reference characteristic value represents absolute elevations of a plurality of reference physical points in the cell in the coordinate system; and
    determining the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system, wherein the updating the hypothetic elevation of the subject includes comparing the physical points with the reference physical points in each cell based on the at least one first  characteristic value of each cell and the at least one first reference characteristic value of each cell.
  2. The system of claim 1, wherein to obtain the set of data points representative of each cell, the at least one processor is further configured to direct the system to perform additional operations including:
    obtaining the data points representative of the surrounding environment; and
    dividing the data points representative of the surrounding environment into the plurality of sets of data points corresponding to the plurality of cells of the surrounding environment.
  3. The system of claim 1 or 2, wherein the at least one feature of the physical point further includes a secondary feature of the physical point, and the at least one processor is further configured to direct the system to perform additional operations including:
    for each cell, determining at least one second characteristic value of the cell based on the feature values of the secondary feature of the corresponding physical points, wherein the at least one second characteristic value represents the secondary feature of the corresponding physical points; and
    for each cell, obtaining at least one second reference characteristic value of the cell based on the location information database, wherein the at least one second reference characteristic value represents the secondary feature of the reference physical points in the cell, and
    the comparing the physical points with the reference physical points in each cell is further based on the at least one second characteristic value of the each cell and the at least one second reference characteristic value of the each cell.
  4. The system of claim 3, wherein the secondary feature includes at least one of an intensity, a coordinate in a reference coordinate system associated with the sensor, a classification, or a scan direction.
  5. The system of claim 4, wherein the secondary feature is an intensity, and
    for each cell, the at least one second characteristic value of the cell includes a characteristic value indicating an overall intensity of the cell and a characteristic value indicating an intensity distribution of the cell.
  6. The system of any one of claims 1 to 5, wherein for each cell, the at least one first characteristic value includes a third characteristic value indicating an overall absolute elevation of the cell and a fourth characteristic value indicating an absolute elevation distribution of the cell, and to determine the at least one first characteristic value of the cell, the at least one processor is further configured to direct the system to perform additional operations including:
    determining a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the corresponding physical points, wherein the preliminary third characteristic value indicates an overall relative elevation of the cell, and the preliminary fourth characteristic value indicates a relative elevation distribution of the cell;
    determining the third characteristic value based on the preliminary third characteristic value and the hypothetic elevation; and
    designating the preliminary fourth characteristic value as the fourth characteristic value.
  7. The system of claim 6, wherein:
    the preliminary third characteristic value of the cell is a mean value of the relative elevations of the corresponding physical points, and
    the preliminary fourth characteristic value of the cell is a covariance of the relative elevations of the corresponding physical points.
  8. The system of any one of claims 1 to 7, wherein the determining the absolute elevation of the subject in the coordinate system includes one or more iterations, and  each current iteration of the one or more iterations includes:
    for each cell, determining a similarity degree between the corresponding physical points and the corresponding reference physical points in the current iteration based on the at least one first characteristic value of the cell in the current iteration and the at least one first reference characteristic value of the cell;
    updating the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration;
    determining whether a termination condition is satisfied in the current iteration; and
    in response to a determination that the termination condition is satisfied, designating the updated hypothetic elevation in the current iteration as the absolute elevation.
  9. The system of claim 8, wherein each current iteration of the one or more iterations further includes:
    in response to a determination that the termination condition is not satisfied,
    updating the at least one first characteristic value of each cell in the current iteration based on the updated hypothetic elevation in the current iteration;
    designating the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration; and
    designating the updated first characteristic value of each cell in the current iteration as the first characteristic value of the cell in the next iteration.
  10. The system of claim 8 or 9, wherein the determining the absolute elevation of the subject in the coordinate system is based on a particle filtering technique.
  11. The system of claim 9, wherein the determining the absolute elevation of the subject in the coordinate system further includes assigning a plurality of particles to the plurality of cells, each of the plurality of particles having a state and being assigned to one or more cells of the plurality of cells, and
    the updating the hypothetic elevation in the current iteration includes:
    determining a weight for each particle of the plurality of particles in the current iteration based on the similarity degrees of the plurality of cells in the current iteration; and
    determining the updated hypothetic elevation in the current iteration based on the weights and the states of the plurality of particles in the current iteration.
  12. The system of claim 11, wherein the updating the hypothetic elevation in the current iteration further includes:
    updating the plurality of particles in the current iteration based on the weights of the plurality of particles in the current iteration; and
    designating the updated particles in the current iteration as the plurality of particles in the next iteration.
  13. The system of any one of claims 1 to 12, wherein for each cell, to obtain the at least one first reference characteristic value representing the absolute elevations of the plurality of reference physical points in the cell, the at least one processor is further configured to direct the system to perform additional operations including:
    obtaining a reference set of reference data points corresponding to the cell from the location information database, each reference data point in the reference set representing a reference physical point in the cell and including an absolute elevation of the reference physical point in the coordinate system; and
    determining the at least one first reference characteristic value of the cell based on the corresponding reference set.
  14. The system of any one of claims 1 to 13, wherein the at least one processor is further configured to direct the system to perform additional operations including:
    obtaining an estimated location of the subject;
    obtaining, based on the location information database, one or more reference absolute elevations of one or more reference objects based on the  estimated location of the subject; and
    determining the hypothetic elevation of the subject in the coordinate system based on the one or more reference absolute elevations.
  15. A method for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment, the method comprising:
    for each cell of a plurality of cells in the surrounding environment,
    obtaining a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point, wherein the at least one feature value is acquired by a sensor assembled on the subject, and the at least one feature of the physical point includes a relative elevation of the physical point with respect to the sensor;
    determining at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor, wherein the at least one first characteristic value represents absolute elevations of the corresponding physical points in the coordinate system; and
    obtaining, based on a location information database, at least one first reference characteristic value of the cell, wherein the at least one first reference characteristic value represents absolute elevations of a plurality of reference physical points in the cell; and
    determining the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system, wherein the updating the hypothetic elevation of the subject includes comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  16. The method of claim 15, wherein the obtaining the set of data points representative of each cell comprises:
    obtaining the data points representative of the surrounding environment; and
    dividing the data points representative of the surrounding environment into the plurality of sets of data points corresponding to the plurality of cells of the surrounding environment.
  17. The method of claim 15 or 16, wherein the at least one feature of the physical point further includes a secondary feature of the physical point, and the method further comprises:
    for each cell, determining at least one second characteristic value of the cell based on the feature values of the secondary feature of the corresponding physical points, wherein the at least one second characteristic value represents the secondary feature of the corresponding physical points; and
    for each cell, obtaining at least one second reference characteristic value of the cell based on the location information database, wherein the at least one second reference characteristic value represents the secondary feature of the reference physical points in the cell, and
    the comparing the physical points with the reference physical points in each cell is further based on the at least one second characteristic value of the each cell and the at least one second reference characteristic value of the each cell.
  18. The method of claim 17, wherein the secondary feature includes at least one of an intensity, a coordinate in a reference coordinate system associated with the sensor, a classification, or a scan direction.
  19. The method of claim 18, wherein the secondary feature is an intensity, and
    for each cell, the at least one second characteristic value of the cell includes a characteristic value indicating an overall intensity of the cell and a characteristic  value indicating an intensity distribution of the cell.
  20. The method of any one of claims 15 to 19, wherein for each cell, the at least one first characteristic value includes a third characteristic value indicating an overall absolute elevation of the cell and a fourth characteristic value indicating an absolute elevation distribution of the cell, and to determine the at least one first characteristic value of the cell, the method further comprises:
    determining a preliminary third characteristic value and a preliminary fourth characteristic value of the cell based on the relative elevations of the corresponding physical points, wherein the preliminary third characteristic value indicates an overall relative elevation of the cell, and the preliminary fourth characteristic value indicates a relative elevation distribution of the cell;
    determining the third characteristic value based on the preliminary third characteristic value and the hypothetic elevation; and
    designating the preliminary fourth characteristic value as the fourth characteristic value.
  21. The method of claim 20, wherein:
    the preliminary third characteristic value of the cell is a mean value of the relative elevations of the corresponding physical points, and
    the preliminary fourth characteristic value of the cell is a covariance of the relative elevations of the corresponding physical points.
  22. The method of any one of claims 15 to 21, wherein the determining the absolute elevation of the subject in the coordinate system includes one or more iterations, and each current iteration of the one or more iterations includes:
    for each cell, determining a similarity degree between the corresponding physical points and the corresponding reference physical points in the current iteration based on the at least one first characteristic value of the cell in the current iteration and the at least one first reference characteristic value of the cell;
    updating the hypothetic elevation in the current iteration based on the similarity degrees of the plurality of cells in the current iteration;
    determining whether a termination condition is satisfied in the current iteration; and
    in response to a determination that the termination condition is satisfied, designating the updated hypothetic elevation in the current iteration as the absolute elevation.
  23. The method of claim 22, wherein each current iteration of the one or more iterations further includes:
    in response to a determination that the termination condition is not satisfied,
    updating the at least one first characteristic value of the cell in the current iteration based on the updated hypothetic elevation in the current iteration;
    designating the updated hypothetic elevation in the current iteration as the hypothetic elevation in a next iteration; and
    designating the updated first characteristic value of the cell in the current iteration as the first characteristic value of the cell in the next iteration.
  24. The method of claim 22 or 23, wherein the determining the absolute elevation of the subject in the coordinate system is based on a particle filtering technique.
  25. The method of claim 23, wherein the determining the absolute elevation of the subject in the coordinate system further includes assigning a plurality of particles to the plurality of cells, each of the plurality of particles having a state and being assigned to one or more cells of the plurality of cells, and
    the updating the hypothetic elevation in the current iteration includes:
    determining a weight for each particle of the plurality of particles in the current iteration based on the similarity degrees of the plurality of cells in the current iteration; and
    determining the updated hypothetic elevation in the current iteration based  on the weights and the states of the plurality of particles in the current iteration.
  26. The method of claim 25, wherein the updating the hypothetic elevation in the current iteration further comprises:
    updating the plurality of particles in the current iteration based on the weights of the plurality of particles in the current iteration; and
    designating the updated particles in the current iteration as the plurality of particles in the next iteration.
  27. The method of any one of claims 15 to 26, wherein for each cell, the obtaining the at least one first reference characteristic value representing the absolute elevations of the plurality of reference physical points in the cell further comprises:
    obtaining a reference set of reference data points corresponding to the cell from the location information database, each reference data point in the reference set representing a reference physical point in the cell and including an absolute elevation of the reference physical point in the coordinate system; and
    determining the at least one first reference characteristic value of the cell based on the corresponding reference set.
  28. The method of any one of claims 15 to 27, wherein the method further comprises:
    obtaining an estimated location of the subject;
    obtaining, based on the location information database, one or more reference absolute elevations of one or more reference objects based on the estimated location of the subject; and
    determining the hypothetic elevation of the subject in the coordinate system based on the one or more reference absolute elevations.
  29. A non-transitory computer-readable storage medium embodying a computer program product, the computer program product comprising instructions for  determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment, wherein the instructions are configured to cause a computing device to perform operations including:
    for each cell of a plurality of cells in the surrounding environment,
    obtaining a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point, wherein the at least one feature value is acquired by a sensor assembled on the subject, and the at least one feature of the physical point includes a relative elevation of the physical point with respect to the sensor;
    determining at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor, wherein the at least one first characteristic value represents absolute elevations of the corresponding physical points in the coordinate system; and
    obtaining, based on a location information database, at least one first reference characteristic value of the cell, wherein the at least one first reference characteristic value represents absolute elevations of a plurality of reference physical points in the cell; and
    determining the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system, wherein the updating the hypothetic elevation of the subject includes comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
  30. A system for determining an absolute elevation of a subject in a coordinate system, the subject being located in a surrounding environment, the system comprising:
    an obtaining module configured to, for each cell of a plurality of cells in the surrounding environment, obtain a set of data points representative of the cell, each data point in the set representing a physical point in the cell and including at least one feature value of at least one feature of the physical point, wherein the at least one feature value is acquired by a sensor assembled on the subject, and the at least one feature of the physical point includes a relative elevation of the physical point with respect to the sensor;
    a characteristic determination module configured to, for each cell, determine at least one first characteristic value of the cell based on a hypothetic elevation of the subject in the coordinate system and the relative elevations of the corresponding physical points with respect to the sensor, wherein the at least one first characteristic value represents absolute elevations of the corresponding physical points in the coordinate system;
    the obtaining module being further configured to, for each cell, obtain, based on a location information database, at least one first reference characteristic value of the cell, wherein the at least one first reference characteristic value represents absolute elevations of a plurality of reference physical points in the cell; and
    an absolute elevation determination module configured to determine the absolute elevation of the subject in the coordinate system by updating the hypothetic elevation of the subject in the coordinate system, wherein the updating the hypothetic elevation of the subject includes comparing the physical points with the reference physical points in each cell based on the at least one first characteristic value of each cell and the at least one first reference characteristic value of each cell.
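Taken together, the claimed operations (per-cell characteristic values, a per-cell similarity degree, and a particle-weighted update of the hypothetic elevation) can be reduced to a short sketch. Everything below is illustrative rather than the claimed implementation: scalar per-cell mean elevations and variances stand in for the characteristic values of claims 6 and 7, a Gaussian kernel stands in for the unspecified similarity degree, and the particle states are fixed candidate absolute elevations.

```python
import math
import statistics

def cell_characteristics(relative_elevations, hypothetic_elevation):
    # Preliminary values from sensor-relative elevations (cf. claim 7):
    # a mean as the overall value and a variance as the distribution
    # (the scalar analogue of the claimed covariance).
    mean_rel = statistics.mean(relative_elevations)
    var_rel = statistics.pvariance(relative_elevations)
    # Shift the mean by the hypothetic elevation to obtain an
    # absolute-elevation characteristic value (cf. claim 6).
    return mean_rel + hypothetic_elevation, var_rel

def similarity(cell_value, ref_value, sigma=1.0):
    # Illustrative Gaussian-shaped similarity degree between the measured
    # and reference characteristic values of a cell.
    return math.exp(-((cell_value - ref_value) ** 2) / (2 * sigma ** 2))

def estimate_elevation(cells, refs, candidates, n_iter=5):
    """cells: one list of relative elevations per cell;
    refs: reference mean absolute elevation per cell (from the database);
    candidates: particle states, i.e. possible absolute elevations."""
    hypothesis = statistics.mean(candidates)  # initial hypothetic elevation
    for _ in range(n_iter):
        # Weight each particle by the product of per-cell similarities.
        weights = []
        for h in candidates:
            score = 1.0
            for rel, ref in zip(cells, refs):
                mean_abs, _ = cell_characteristics(rel, h)
                score *= similarity(mean_abs, ref)
            weights.append(score)
        total = sum(weights)
        weights = [w / total for w in weights]
        # Updated hypothetic elevation: weighted mean of particle states.
        hypothesis = sum(w * h for w, h in zip(weights, candidates))
    return hypothesis
```

A usage note: with two cells whose relative elevations center on 1.1 and 2.1, reference means of 11.1 and 12.1, and candidate elevations of 9, 10, and 11, the particle at 10 receives the dominant weight and the estimate converges to the true elevation of 10.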

Publications (1)

Publication Number Publication Date
WO2021012243A1 2021-01-28

Family

ID=74193092


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113206874A (en) * 2021-04-25 2021-08-03 Tencent Technology (Shenzhen) Co., Ltd. Vehicle-road cooperative processing method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110199257A1 (en) * 2010-02-18 2011-08-18 David Lundgren Method and system for updating altitude information for a location by using terrain model information to prime altitude sensors
US9983002B2 (en) * 2013-06-27 2018-05-29 Google Llc Enhancing geolocation using barometric data to determine floors at a location
WO2018153811A1 (en) * 2017-02-24 2018-08-30 Here Global B.V. Precise altitude estimation for indoor positioning
CN109783593A (en) * 2018-12-27 2019-05-21 驭势科技(北京)有限公司 Map updating system and method usable for autonomous driving

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3924560B2 (en) * 2003-12-08 2007-06-06 日立ソフトウエアエンジニアリング株式会社 Elevation data generation program and altitude data generation method
US7359038B1 (en) * 2006-06-22 2008-04-15 Donoghue Patrick J Passive determination of ground target location
KR100678397B1 (en) * 2006-06-22 2007-02-02 대원지리정보(주) System and method for safe transmission and recovery of geographic information data through adaptive coordinate estimation in a geographic information system
RU2008135191A (en) * 2008-08-28 2010-03-10 Федеральное государственное унитарное предприятие федеральный научно-производственный центр "Научно-исследовательский институт изме METHOD OF NAVIGATION OF MOVING OBJECTS
DE112009005295B4 (en) * 2009-10-21 2014-07-03 Mitsubishi Electric Corporation The map information processing device
CN103370601B (en) * 2011-02-18 2018-01-23 索尼移动通信株式会社 The system and method for determining height above sea level
KR101495730B1 (en) * 2014-06-27 2015-02-25 (주)올포랜드 System of updating data in numerical map with immediately correcting error
DE102014220687A1 (en) * 2014-10-13 2016-04-14 Continental Automotive Gmbh Communication device for a vehicle and method for communicating
RU2611564C1 (en) * 2016-02-11 2017-02-28 Российская Федерация, от имени которой выступает Государственная корпорация по атомной энергии "Росатом" Method of aircrafts navigation
CN108268483A (en) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 Method for generating a grid map for unmanned-vehicle navigation control
US10070259B1 (en) * 2017-02-24 2018-09-04 Here Global B.V. Altitude map for indoor positioning services
KR102374919B1 (en) * 2017-10-16 2022-03-16 주식회사 만도모빌리티솔루션즈 Device And Method of Automatic Driving Support
CN108871353B (en) * 2018-07-02 2021-10-15 上海西井信息科技有限公司 Road network map generation method, system, equipment and storage medium


Also Published As

Publication number Publication date
CN112384756A (en) 2021-02-19
CN112384756B (en) 2023-11-17

Similar Documents

Publication Publication Date Title
US20220138896A1 (en) Systems and methods for positioning
US20220187843A1 (en) Systems and methods for calibrating an inertial measurement unit and a camera
US11781863B2 (en) Systems and methods for pose determination
WO2020243937A1 (en) Systems and methods for map-matching
AU2018286610A1 (en) Systems and methods for determining driving action in autonomous driving
AU2017411198B2 (en) Systems and methods for route planning
US11635304B2 (en) Map generation device, control method, program and storage medium
WO2019228520A1 (en) Systems and methods for indoor positioning
WO2021051296A1 (en) Systems and methods for calibrating a camera and a multi-line lidar
WO2021212294A1 (en) Systems and methods for determining a two-dimensional map
CN112041210B (en) System and method for autopilot
WO2020206774A1 (en) Systems and methods for positioning
WO2021012243A1 (en) Positioning systems and methods
WO2021077315A1 (en) Systems and methods for autonomous driving
WO2021012245A1 (en) Systems and methods for pose determination
US20220178701A1 (en) Systems and methods for positioning a target subject
WO2021212297A1 (en) Systems and methods for distance measurement
US11940279B2 (en) Systems and methods for positioning
WO2021051358A1 (en) Systems and methods for generating pose graph
US20220187432A1 (en) Systems and methods for calibrating a camera and a lidar
US11965744B2 (en) Systems and methods for indoor positioning
WO2021035532A1 (en) Systems and methods for positioning target subject

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19938358

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19938358

Country of ref document: EP

Kind code of ref document: A1