WO2022062019A1 - Map matching method and apparatus, electronic device and storage medium - Google Patents

Map matching method and apparatus, electronic device and storage medium

Info

Publication number
WO2022062019A1
WO2022062019A1 (application PCT/CN2020/123162)
Authority
WO
WIPO (PCT)
Prior art keywords
semantic
candidate
map
positioning information
matching
Prior art date
Application number
PCT/CN2020/123162
Other languages
English (en)
French (fr)
Inventor
何潇
Original Assignee
驭势科技(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 驭势科技(北京)有限公司 filed Critical 驭势科技(北京)有限公司
Priority to EP20954865.0A priority Critical patent/EP4206610A4/en
Priority to KR1020237009778A priority patent/KR20230086664A/ko
Priority to JP2023516484A priority patent/JP2023541167A/ja
Priority to US18/028,565 priority patent/US20230358547A1/en
Publication of WO2022062019A1 publication Critical patent/WO2022062019A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3848 Data obtained from both position sensors and additional sensors
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3859 Differential updating map data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 Structures of map data
    • G01C21/387 Organisation of map data, e.g. version management or database structures
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Definitions

  • the embodiments of the present application relate to the technical field of intelligent driving, and in particular, to a map matching method, apparatus, electronic device, and non-transitory computer-readable storage medium.
  • In the field of intelligent driving, vehicle positioning technology occupies an important position.
  • At present, mainstream vehicle positioning technologies include vision-based VSLAM (visual simultaneous localization and mapping) technology and lidar-based LSLAM (laser simultaneous localization and mapping) technology. These methods usually need to establish a dense positioning map in advance and store descriptors, intensities and other information for matching, which occupies a large amount of storage resources.
  • some localization schemes are based on vector semantic maps for localization.
  • Although the sparseness of the vector map greatly reduces its size, the lack of descriptors and other information also poses a challenge to achieving efficient, high-accuracy real-time matching.
  • At least one embodiment of the present application provides a map matching method, apparatus, electronic device, and non-transitory computer-readable storage medium.
  • an embodiment of the present application proposes a map matching method, which includes:
  • the local map information includes a plurality of map semantic features
  • the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information are determined.
  • an embodiment of the present application further proposes a map matching device, the device comprising:
  • an acquisition unit for acquiring initial positioning information of the vehicle and vehicle sensor data; based on the initial positioning information, acquiring local map information, where the local map information includes a plurality of map semantic features;
  • a first determining unit configured to determine a plurality of candidate positioning information based on the initial positioning information; determine a plurality of observation semantic features based on the vehicle sensor data;
  • a matching unit configured to perform matching for each piece of candidate positioning information to obtain matching pairs;
  • the second determining unit is configured to determine, based on the matching pair corresponding to each candidate positioning information, the optimal candidate positioning information of the vehicle and a matching pair corresponding to the optimal candidate positioning information.
  • an embodiment of the present application further provides an electronic device, including: a processor and a memory; the processor is configured to execute the steps of the map matching method according to the first aspect by calling a program or an instruction stored in the memory.
  • an embodiment of the present application further provides a non-transitory computer-readable storage medium for storing a program or an instruction, the program or instruction causing a computer to execute the steps of the map matching method according to the first aspect.
  • In the map matching solution provided by the embodiments of the present application, a plurality of observation semantic features are determined based on vehicle sensor data; based on the initial positioning information of the vehicle, local map information (which includes a plurality of map semantic features) is obtained, and a plurality of pieces of candidate positioning information (including the initial positioning information) are determined; further, for each piece of candidate positioning information, the observation semantic features and the candidate map semantic features are matched to obtain matching pairs; thus, based on the matching pairs corresponding to each piece of candidate positioning information, the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information are determined.
  • The embodiments of the present application are applicable to vector semantic maps, and only use the vector information of the vector semantic map (for example, information of map semantic features) to realize real-time matching between observation features and map features, without relying on additional data such as descriptors and intensities, achieving a good matching effect while reducing storage requirements and computing power usage.
  • the embodiments of the present application do not have special requirements for sensor types (cameras, lidars, etc. are all applicable).
  • FIG. 1 is an exemplary scene diagram of a matching problem provided by an embodiment of the present application
  • FIG. 2 is an exemplary architecture diagram of an intelligent driving vehicle provided by an embodiment of the present application
  • FIG. 3 is an exemplary block diagram of an intelligent driving system provided by an embodiment of the present application.
  • FIG. 4 is an exemplary block diagram of a map matching apparatus provided by an embodiment of the present application.
  • FIG. 5 is an exemplary block diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 6 is an exemplary flowchart of a map matching method provided by an embodiment of the present application.
  • FIG. 7 is an example diagram of a semantic-Euclidean distance matrix provided by an embodiment of the present application.
  • FIG. 8 is an example diagram of a distance sorting matrix determined based on the semantic-Euclidean distance matrix shown in FIG. 7.
  • FIG. 9 is an example diagram of a matrix obtained after updating the distance sorting matrix shown in FIG. 8.
  • each matching pair includes one element in M and one element in M'.
  • When applied to map matching, the set M can be understood as a set of observation features, and the set M' can be understood as a set of map features.
  • the observation feature may be a real-time observation feature, and both the observation feature and the map feature are semantic features, that is, the observation feature is the observation semantic feature, and the map feature is the map semantic feature.
  • the observation semantic feature can be understood as the observation semantic feature of the target determined based on the vehicle sensor data.
  • the vehicle sensor data is image data
  • the target included in the image can be determined by processing the image data through the target detection algorithm.
  • the category and location of the target can be understood as the observation semantic feature.
  • the image includes lane lines, traffic markings (such as going straight, turning left, turning right, etc.), traffic lights (ie, traffic lights), traffic signs, etc., which are all observed semantic features of the target used for localization.
  • Map semantic features can be understood as the semantic features of the target included in the map (such as a vector map) for localization, such as lane lines, traffic markings, traffic lights, traffic signs, etc. in the map, all of which are used for localization.
  • In order to facilitate the acquisition of the map semantic features from the map, the map may be preprocessed so that it includes information of the map semantic features, such as the semantic labels and locations of the map semantic features it contains; in this way, when the map is acquired, the information of the map semantic features can be obtained from it.
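As a concrete illustration of the preprocessing described above, each entry of a vector semantic map might store a semantic label, a position, and an optional upper-level (parent) label. This is a minimal sketch in Python; the field names and types are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MapSemanticFeature:
    """One hypothetical record of a preprocessed vector semantic map."""
    label: str                           # semantic label, e.g. "traffic_light"
    position: Tuple[float, float]        # 2D coordinates in the map frame
    parent_label: Optional[str] = None   # upper-level semantic feature, if any

# Example: a lane-line endpoint whose upper-level feature is the lane line.
endpoint = MapSemanticFeature("lane_line_endpoint", (12.5, 3.0), "lane_line")
light = MapSemanticFeature("traffic_light", (40.0, 6.0))
```

With such records, the semantic labels and locations needed by the matching conditions below are available directly from the map, without descriptors or intensity data.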
  • In map matching, as many matching pairs as possible are found according to certain constraints, and the map features corresponding to different observation features are thereby determined, providing a basis for subsequent vehicle positioning.
  • The constraints include not only a minimum-distance constraint but also other constraints; together they determine the matching pairs, and their content will be described in detail below.
  • The embodiments of the present application provide a map matching solution, which determines multiple observation semantic features based on vehicle sensor data, obtains local map information (which includes multiple map semantic features) based on the initial positioning information of the vehicle, and determines multiple pieces of candidate positioning information (including the initial positioning information); further, for each piece of candidate positioning information, the observation semantic features and the candidate map semantic features are matched to obtain matching pairs; thus, based on the matching pairs corresponding to each piece of candidate positioning information, the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information are determined.
  • The embodiments of the present application are applicable to vector semantic maps, and only use the vector information of the vector semantic map (for example, information of map semantic features) to realize real-time matching between observation features and map features, without relying on additional data such as descriptors and intensities, achieving a good matching effect while reducing storage requirements and computing power usage.
  • the embodiments of the present application do not have special requirements for sensor types (cameras, lidars, etc. are all applicable).
  • the semantic-Euclidean distance between the map semantic feature and the observation semantic feature can be determined, and then a distance matrix can be determined to implement vector semantic map matching based on the distance matrix.
  • the distance matrix-based matching method is widely applicable to various matching scenarios where a metric distance can be defined.
  • the embodiments of the present application can be applied to intelligent driving vehicles, and can also be applied to electronic devices.
  • the intelligent driving vehicle is a vehicle equipped with different levels of intelligent driving systems.
  • The intelligent driving systems include, for example, an unmanned driving system, an assisted driving system, a driving assistance system, a highly automated driving system, a fully automated driving system, and the like.
  • the electronic device is installed with an intelligent driving system.
  • the electronic device can be used to test the intelligent driving algorithm.
  • the electronic device can be a vehicle-mounted device.
  • the electronic device can also be applied to other fields.
  • the embodiments of the present application take an intelligent driving vehicle as an example to describe a map matching method, a map matching apparatus, an electronic device, or a non-transitory computer-readable storage medium.
  • FIG. 2 is an exemplary architectural diagram of an intelligent driving vehicle according to an embodiment of the present application.
  • an intelligent driving vehicle includes: a sensor group, an intelligent driving system 200 , a vehicle underlying execution system, and other components that can be used to drive the vehicle and control the operation of the vehicle, such as a brake pedal, a steering wheel, and an accelerator pedal.
  • the sensor group is used to collect the data of the external environment of the vehicle and detect the position data of the vehicle.
  • the sensor group includes, but is not limited to, at least one of a camera, a lidar, a millimeter-wave radar, an ultrasonic radar, a GPS (Global Positioning System, global positioning system), and an IMU (Inertial Measurement Unit, inertial measurement unit).
  • the sensor group is also used to collect dynamic data of the vehicle, for example, the sensor group also includes but is not limited to at least one of a wheel speed sensor, a speed sensor, an acceleration sensor, a steering wheel angle sensor, and a front wheel angle sensor.
  • the intelligent driving system 200 is configured to acquire sensing data of a sensor group, wherein the sensing data includes but is not limited to images, videos, laser point clouds, millimeter waves, GPS information, vehicle status, and the like.
  • the intelligent driving system 200 performs environmental perception and vehicle positioning based on the sensing data, and generates sensing information and vehicle pose; the intelligent driving system 200 performs planning and decision-making based on the sensing information and the vehicle pose, and generates Planning and decision-making information; the intelligent driving system 200 generates vehicle control instructions based on the planning and decision-making information, and sends them to the vehicle bottom execution system.
  • the intelligent driving system 200 may be a software system, a hardware system, or a system combining software and hardware.
  • the intelligent driving system 200 is a software system running on an operating system
  • the in-vehicle hardware system is a hardware system supporting the running of the operating system.
  • the intelligent driving system 200 may interact with a cloud server.
  • the intelligent driving system 200 interacts with the cloud server through a wireless communication network (for example, including but not limited to GPRS network, Zigbee network, Wifi network, 3G network, 4G network, 5G network and other wireless communication networks).
  • a cloud server is used to interact with the vehicle.
  • the cloud server can send the environment information, positioning information, control information and other information required in the intelligent driving process of the vehicle to the vehicle.
  • the cloud server may receive sensor data, vehicle status information, vehicle driving information and related information requested by the vehicle from the vehicle end.
  • the cloud server may remotely control the vehicle based on user settings or vehicle request.
  • The cloud server may be a server or a server group. The server group can be centralized or distributed. In some embodiments, the cloud server may be local or remote.
  • the vehicle bottom execution system is used to receive vehicle control instructions, and control the vehicle to drive based on the vehicle control instructions.
  • the underlying vehicle execution systems include, but are not limited to, steering systems, braking systems, and drive systems.
  • the vehicle bottom-level execution system may further include a bottom-level controller, which can parse the vehicle control instructions and deliver them to corresponding systems such as a steering system, a braking system, and a driving system, respectively.
  • The intelligent driving vehicle may further include a vehicle CAN bus (not shown in FIG. 2), and the vehicle CAN bus is connected to the vehicle bottom execution system.
  • the information interaction between the intelligent driving system 200 and the vehicle bottom execution system is transmitted through the vehicle CAN bus.
  • FIG. 3 is an exemplary block diagram of an intelligent driving system 300 according to an embodiment of the present application.
  • the intelligent driving system 300 may be implemented as the intelligent driving system 200 in FIG. 2 or a part of the intelligent driving system 200 for controlling the driving of the vehicle.
  • the intelligent driving system 300 can be divided into multiple modules, for example, it may include: a perception module 301 , a planning module 302 , a control module 303 , a map matching module 304 and some other modules that can be used for intelligent driving.
  • the perception module 301 is used for environmental perception and positioning.
  • the perception module 301 is configured to acquire sensor data, V2X (Vehicle to X, vehicle wireless communication) data, high-precision maps and other data, and perform environmental perception and positioning based on at least one of the above data, and generate perception information and location information.
  • the perception information may include, but is not limited to, at least one of the following: obstacle information, road signs/marks, pedestrian/vehicle information, and drivable area.
  • the positioning information includes the vehicle pose.
  • the planning module 302 is used for path planning and decision making.
  • the planning module 302 generates planning and decision information based on the sensing information and positioning information generated by the sensing module 301 .
  • In some embodiments, the planning module 302 may also generate planning and decision-making information in combination with at least one of V2X data, high-precision maps, and other data.
  • The planning information may include, but is not limited to, a planned path, etc.; the decision information may include, but is not limited to, at least one of the following: behavior (for example, including but not limited to following, overtaking, parking, and detouring), vehicle heading, vehicle speed, desired acceleration of the vehicle, desired steering wheel angle, etc.
  • the control module 303 is configured to generate the control instructions of the vehicle bottom execution system based on the planning and decision information, and issue the control instructions, so that the vehicle bottom execution system controls the driving of the vehicle.
  • the control instructions may include, but are not limited to, steering wheel steering, lateral control instructions, longitudinal control instructions, and the like.
  • The map matching module 304 is used to determine a plurality of observation semantic features based on the vehicle sensor data; based on the initial positioning information of the vehicle, obtain local map information (which includes a plurality of map semantic features) and determine a plurality of pieces of candidate positioning information (including the initial positioning information); then, for each piece of candidate positioning information, match the observation semantic features with the candidate map semantic features to obtain matching pairs; and thus, based on the matching pairs corresponding to each piece of candidate positioning information, determine the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information.
  • The function of the map matching module 304 can be integrated into the perception module 301, the planning module 302 or the control module 303, or it can be configured as a module independent of the intelligent driving system 300; the map matching module 304 can be a software module, a hardware module, or a module combining software and hardware.
  • the map matching module 304 is a software module running on an operating system
  • the vehicle-mounted hardware system is a hardware system that supports the running of the operating system.
  • FIG. 4 is an exemplary block diagram of a map matching apparatus 400 according to an embodiment of the present application.
  • the map matching apparatus 400 may be implemented as the map matching module 304 in FIG. 3 or a part of the map matching module 304 .
  • the map matching apparatus 400 may include, but is not limited to, the following units: an acquiring unit 401 , a first determining unit 402 , a matching unit 403 and a second determining unit 404 .
  • the obtaining unit 401 is configured to obtain relevant information of the vehicle, for example, the relevant information includes information that is directly or indirectly associated with the vehicle, such as initial positioning information and vehicle sensor data.
  • the initial positioning information can come from a priori information outside the vehicle, or can come from the estimation of the second determination unit 404; the vehicle sensor data can be obtained through data interaction with sensors installed on the vehicle.
  • The obtaining unit 401 may obtain local map information based on the initial positioning information. In some embodiments, the obtaining unit 401 may obtain local map information from a pre-established vector semantic map based on the initial positioning information, where the local map information is a part of the vector semantic map. In some embodiments, the obtaining unit 401 may obtain local map information by indexing the vector semantic map through algorithms such as fast nearest neighbors, based on the initial positioning information.
  • the pre-established vector semantic map includes information of map semantic features, such as lane lines, traffic markings, traffic lights, and traffic signs in the map, all of which are map semantic features.
  • the information of the map semantic feature includes, for example, information related to the map semantic feature, such as the semantic label and location of the map semantic feature.
  • When acquiring the local map information, the acquiring unit 401 can acquire information of map semantic features from the local map information, that is, it can acquire the plurality of map semantic features included in the local map information.
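The retrieval of local map information described above can be sketched as follows. This is an illustrative radius query, assuming features are (label, x, y) tuples; a real system might index the vector map with a k-d tree or another fast nearest-neighbour structure, as the embodiments suggest, and the radius value is an assumption.

```python
import math

def local_map(features, init_xy, radius):
    """Return map semantic features within `radius` of the initial position.

    `features` is a list of (label, x, y) tuples in the map frame; this
    representation and the brute-force scan are illustrative only.
    """
    x0, y0 = init_xy
    return [f for f in features
            if math.hypot(f[1] - x0, f[2] - y0) <= radius]

# Tiny hypothetical vector map: one nearby lane line, one distant light.
vector_map = [("lane_line", 1.0, 2.0), ("traffic_light", 120.0, 5.0)]
nearby = local_map(vector_map, (0.0, 0.0), radius=50.0)
```

The returned subset plays the role of the local map information from which candidate map semantic features are later drawn.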
  • The first determining unit 402 is configured to determine, based on the initial positioning information, multiple pieces of candidate positioning information, where the multiple pieces of candidate positioning information include the initial positioning information.
  • the initial positioning information generally has a large error, and the map semantic features have strong sparseness
  • If the initial positioning information is directly used for map matching, the matching accuracy rate may be low or the effective matching features may be few. Therefore, by determining multiple pieces of candidate positioning information, the accuracy of subsequent map matching can be improved and the number of valid matching features increased.
  • the first determining unit 402 may perform discrete random sampling on a space within a certain range of the initial positioning information to obtain a plurality of candidate positioning information. In some embodiments, the first determining unit 402 generates n pieces of candidate positioning information including the initial positioning information according to a certain probability distribution within a certain spatial range r of the initial positioning information through Monte Carlo random sampling.
  • The spatial range r and the number n of candidate positioning information are both related to the uncertainty of the initial positioning information: the higher the uncertainty, the larger the values of r and n generally are.
  • the first determining unit 402 may determine a plurality of observation semantic features based on vehicle sensor data, where the plurality of observation semantic features are real-time observation semantic features.
  • the vehicle sensor data is image data
  • the first determining unit 402 processes the image data through a target detection algorithm, and can determine the category and position of the target included in the image, and the target can be understood as the observation semantic feature.
  • the image includes lane lines, traffic markings (such as going straight, turning left, turning right, etc.), traffic lights (ie, traffic lights), traffic signs, etc., which are all observed semantic features.
  • The vehicle sensor data can be in any form (for example, lidar data), as long as the observation semantic features can be identified from the sensor data.
  • the matching unit 403 is configured to perform map matching for each candidate positioning information.
  • For each piece of candidate positioning information, the matching unit 403: converts the multiple map semantic features to the coordinate system of the vehicle sensor based on the candidate positioning information, obtaining multiple candidate map semantic features in that coordinate system; then matches the multiple observation semantic features with the multiple candidate map semantic features to obtain matching pairs.
  • In some embodiments, the matching unit 403 also converts the multiple observation semantic features into the same vehicle sensor coordinate system, and then matches the multiple observation semantic features with the multiple candidate map semantic features in that coordinate system to obtain matching pairs.
  • In some embodiments, the matching unit 403 performs observability screening on each candidate map semantic feature, that is, it determines whether the candidate map semantic feature is in the blind area of the vehicle sensor; if so, the candidate map semantic feature cannot be matched by any observation semantic feature and should be filtered out, not participating in subsequent matching.
  • In some embodiments, the matching unit 403 converts the multiple map semantic features to the coordinate system of the vehicle sensor based on the candidate positioning information and, after obtaining the multiple candidate map semantic features in that coordinate system, removes the candidate map semantic features that are in the blind area of the vehicle sensor; the observation semantic features are then matched with the remaining candidate map semantic features to obtain matching pairs.
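The coordinate conversion and observability screening described above might look like this sketch, assuming 2D poses (x, y, heading) and a blind area modeled as anything behind the sensor, outside an assumed field of view, or beyond an assumed maximum range; the specific parameters and the 2D simplification are not given in the patent.

```python
import math

def to_sensor_frame(feature_xy, pose):
    """Transform a map-frame point into the vehicle-sensor frame for a
    candidate pose (x, y, heading) via a standard 2D rigid transform."""
    x, y, h = pose
    dx, dy = feature_xy[0] - x, feature_xy[1] - y
    c, s = math.cos(-h), math.sin(-h)
    return (c * dx - s * dy, s * dx + c * dy)

def visible(pt, max_range=50.0, fov=math.pi / 2):
    """Observability screening: drop features behind the sensor, outside
    an assumed field of view, or beyond an assumed max range."""
    r = math.hypot(*pt)
    return 0.0 < r <= max_range and abs(math.atan2(pt[1], pt[0])) <= fov / 2

pose = (0.0, 0.0, 0.0)
map_points = [(10.0, 0.0), (-5.0, 0.0)]  # one ahead, one behind the sensor
candidates = [to_sensor_frame(p, pose) for p in map_points]
kept = [p for p in candidates if visible(p)]  # the point behind is screened out
```

The surviving points are the candidate map semantic features that go on to the matching step.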
  • the matching pairs are specified to satisfy the following conditions 1 to 4:
  • Condition 1: There is a one-to-one correspondence between candidate map semantic features and their matching observed semantic features. That is, one candidate map semantic feature can only match one observed semantic feature, and one observed semantic feature can only match one candidate map semantic feature.
  • Condition 2: The semantic label of the candidate map semantic feature is the same as the semantic label of its matching observed semantic feature.
  • Condition 3: The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is less than or equal to the Euclidean distance threshold. The semantic-Euclidean distance is used to characterize the similarity between a candidate map semantic feature and an observed semantic feature; the Euclidean distance threshold is inversely correlated with the Euclidean distance between the candidate map semantic feature and the candidate positioning information.
  • The Euclidean distance between the candidate map semantic feature and the candidate positioning information can be understood as the Euclidean distance between the position (coordinate value) corresponding to the candidate map semantic feature and the candidate positioning information (coordinate value).
  • In some embodiments, the Euclidean distance threshold is determined by a formula in which:
  • th is the Euclidean distance threshold;
  • th0 is a set fixed a priori threshold;
  • t is the Euclidean distance between the candidate map semantic feature and the candidate positioning information;
  • f(t) is a mapping function inversely related to t.
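The text above names the quantities involved without reproducing the exact formula, so the following sketch only illustrates the stated behavior: a threshold built from a fixed prior th0 and a mapping f(t) that decreases as t grows. The specific form f(t) = 1 / (1 + alpha * t), the combination by multiplication, and the alpha parameter are all assumptions.

```python
def euclid_threshold(t, th0=1.0, alpha=0.1):
    """Illustrative Euclidean-distance threshold that shrinks as the
    candidate map semantic feature moves farther from the candidate pose.

    th0 is the fixed a priori threshold from the text; f(t) = 1/(1+alpha*t)
    is an assumed inverse mapping, not the patent's formula.
    """
    return th0 * (1.0 / (1.0 + alpha * t))
```

Any f(t) that is inversely related to t would satisfy the description; this choice is only one convenient example.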
  • Condition 4: The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is the smallest among all semantic-Euclidean distances corresponding to that candidate map semantic feature, and also the smallest among all semantic-Euclidean distances corresponding to that observed semantic feature.
  • a semantic-Euclidean distance is calculated between the candidate map semantic feature and each observed semantic feature.
  • if a certain semantic-Euclidean distance is the smallest among the multiple semantic-Euclidean distances corresponding to the candidate map semantic feature, but is not the smallest among the multiple semantic-Euclidean distances corresponding to the observed semantic feature, the two do not form a matching pair.
  • the matching pair also needs to satisfy the following condition 5:
  • Condition 5 If the candidate map semantic feature has an upper-level semantic feature, the semantic label of the upper-level semantic feature of the candidate map semantic feature is the same as the semantic label of the upper-level semantic feature of its matching observed semantic feature.
  • the semantic-Euclidean distance between the upper-level semantic feature of the candidate map semantic feature and the upper-level semantic feature of the observed semantic feature matching it is less than or equal to the Euclidean distance threshold.
  • the upper-level semantic features represent the overall information of the target for localization
  • the lower-level semantic features represent parts or endpoints of the target for localization.
  • the upper-level semantic feature is the lane line
  • the lower-level semantic feature is the endpoint of the lane line.
  • the purpose of setting condition 5 is to reduce the chance of matching errors, for example, to reduce the chance of situations like the end point of one lane line matching the end point of another lane line.
  • the matching unit 403 determines the semantic-Euclidean distance between any candidate map semantic feature and any observed semantic feature in the following manner:
  • if the semantic label of the observed semantic feature is different from the semantic label of the candidate map semantic feature, the semantic-Euclidean distance is an invalid value INF;
  • if the semantic labels are the same and the Euclidean distance between the coordinate value corresponding to the observed semantic feature and the coordinate value corresponding to the candidate map semantic feature is less than or equal to the Euclidean distance threshold, the semantic-Euclidean distance is that Euclidean distance;
  • if the semantic labels are the same but that Euclidean distance is greater than the Euclidean distance threshold, the semantic-Euclidean distance is an invalid value INF.
  • the semantic-Euclidean distance is determined by: d = de if label = label' and de ≤ th, and d = INF otherwise; where d is the semantic-Euclidean distance, INF is an infinite value (that is, an invalid value), th is the Euclidean distance threshold, de = f(m, m'), and f(m, m') is the Euclidean distance calculation function; m is the observed semantic feature, m' is the candidate map semantic feature, and label and label' are their respective semantic labels.
  • if the semantic label of the observed semantic feature is the same as the semantic label of the candidate map semantic feature, and the observed semantic feature or the candidate map semantic feature has an upper-level semantic feature, it is judged whether the semantic label of the upper-level semantic feature of the observed semantic feature is the same as that of the upper-level semantic feature of the candidate map semantic feature; if they are different, the semantic-Euclidean distance between the observed semantic feature and the map semantic feature is determined to be the invalid value INF.
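The distance rule above can be sketched as a small Python function that applies the label check, the upper-level label check (Condition 5), and the adaptive threshold th = th0 × f(t). The concrete mapping f(t) = 1/(1 + α·t) and the dictionary feature representation are illustrative assumptions only; the embodiments do not fix either.

```python
import math

INF = float("inf")  # invalid value

def adaptive_threshold(th0, t, alpha=0.1):
    # th = th0 * f(t); f(t) = 1/(1 + alpha*t) is a hypothetical choice,
    # used here only because it decreases as t grows.
    return th0 * (1.0 / (1.0 + alpha * t))

def semantic_euclidean_distance(obs, cand, th):
    """obs/cand: dicts with 'label', 'xy', and optionally 'parent_label'."""
    # Different semantic labels -> invalid value INF
    if obs["label"] != cand["label"]:
        return INF
    # Condition 5: upper-level semantic labels must also agree (None == None
    # when neither feature has an upper-level semantic feature).
    if obs.get("parent_label") != cand.get("parent_label"):
        return INF
    # Point-to-point case of the distance calculation function f(m, m')
    de = math.dist(obs["xy"], cand["xy"])
    return de if de <= th else INF
```

For example, two lane-line endpoints 5 m apart match under a 5 m threshold but are rejected once their parent lane-line labels differ.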
  • the matching unit 403 may determine a semantic-Euclidean distance matrix composed of a plurality of observed semantic features and a plurality of candidate map semantic features.
  • m1, m2, m3, m4 and m5 are observed semantic features
  • m'1, m'2, m'3, m'4 and m'5 are candidate map semantic features.
  • the semantic-Euclidean distance between m1 and m'1 is 0.5
  • the semantic-Euclidean distance between m1 and m'2 is INF
  • the semantic-Euclidean distance between m1 and m'3 is 0.1
  • each element in the semantic-Euclidean distance matrix shown in Figure 7 represents the semantic-Euclidean distance.
  • the matching unit 403 may determine a distance sorting matrix based on the semantic-Euclidean distance matrix; each element in the distance sorting matrix is a two-tuple representing the rankings of the semantic-Euclidean distance within its row and its column; the smaller the distance, the smaller the ranking value, and a ranking value of 1 indicates the closest distance.
  • the distance sorting matrix shown in FIG. 8 is determined based on the semantic-Euclidean distance matrix shown in FIG. 7. For example, since the rankings of the semantic-Euclidean distance between m1 and m'1 in FIG. 7 within its row and column are 1 and 3, respectively, the value of the element corresponding to m1 and m'1 in FIG. 8 is (1, 3).
  • the matching unit 403 may determine the observation semantic feature and the candidate map semantic feature corresponding to the binary group (1,1) in the distance sorting matrix as a matching pair. For example, in Fig. 8, m1 and m'3 are determined as matching pairs, and m5 and m'5 are also determined as matching pairs.
  • after determining the observed semantic feature and candidate map semantic feature corresponding to a two-tuple (1,1) in the distance sorting matrix as a matching pair, the matching unit 403 modifies all elements of the row and column corresponding to that two-tuple (1,1) to the invalid value INF and updates the distance sorting matrix.
  • FIG. 9 is an example diagram of a matrix obtained after updating the distance sorting matrix shown in FIG. 8 .
  • all elements of the rows and columns corresponding to the two-tuples (1,1) in Figure 8 are modified to the invalid value INF; the elements modified to INF cause the ranking values of the elements in the same rows and columns to change. It should be noted that modification to INF is merely an intermediate operation for updating the distance sorting matrix and does not mean that the semantic-Euclidean distance itself is INF.
  • the matching unit 403 may determine the observation semantic feature and the map semantic feature corresponding to the binary group (1,1) in the updated distance sorting matrix as a matching pair. For example, in Fig. 9, m3 and m'2, m4 and m'4 are all determined as matching pairs.
  • the matching unit 403 may repeatedly update the distance sorting matrix and determine matching pairs, until no binary group is (1,1) in the updated distance sorting matrix.
  • in any matching scenario, as long as a way of measuring distance is defined, a distance matrix can be constructed and matching can then be performed to obtain a matching result; therefore, the distance-matrix-based matching method proposed in the embodiments of the present application has wide applicability.
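The distance-sorting-matrix procedure described above (rank each valid distance within its row and column, pair the (1,1) entries, invalidate their rows and columns, and repeat) can be sketched as follows. The list-of-lists matrix representation is an assumption, and distances are assumed distinct; ties would need an extra tie-break rule not specified in the text.

```python
INF = float("inf")  # invalid value

def rank_tuple(D, i, j):
    """Ranking of D[i][j] within its row and its column (1 = smallest)."""
    row_rank = 1 + sum(1 for v in D[i] if v < D[i][j])
    col_rank = 1 + sum(1 for row in D if row[j] < D[i][j])
    return (row_rank, col_rank)

def match_by_distance_matrix(D):
    """Greedy mutual-nearest matching on a semantic-Euclidean distance matrix.
    Rows index observed features, columns index candidate map features."""
    D = [row[:] for row in D]          # work on a copy
    n_rows, n_cols = len(D), len(D[0])
    pairs = []
    while True:
        # entries ranked (1,1): smallest in both their row and their column
        hits = [(i, j) for i in range(n_rows) for j in range(n_cols)
                if D[i][j] != INF and rank_tuple(D, i, j) == (1, 1)]
        if not hits:
            return pairs
        for i, j in hits:
            pairs.append((i, j))
            for k in range(n_rows):    # invalidate the matched column
                D[k][j] = INF
            for k in range(n_cols):    # invalidate the matched row
                D[i][k] = INF
```

Invalidating a matched row and column enforces the one-to-one constraint (Condition 1), and the repeated pass mirrors the FIG. 8 to FIG. 9 update until no (1,1) two-tuple remains.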
  • the second determining unit 404 is configured to determine, based on the matching pair corresponding to each candidate positioning information, the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information.
  • the optimal candidate positioning information can be used as the initial positioning information, so as to perform map matching on the observed semantic features acquired in real time.
  • the second determining unit 404 selects the candidate positioning information with the largest number of matching pairs as the optimal candidate positioning information for the vehicle.
  • the second determining unit 404 may determine the evaluation value of each candidate positioning information based on the prior contribution of different candidate map semantic features to the vehicle positioning and the matching pairs corresponding to each candidate positioning information.
  • the prior contribution degree can be pre-configured; for example, ground semantic features such as lane lines contribute more to vehicle positioning than non-ground features such as road signs, so the prior contribution degree of lane lines can be set higher than that of road signs.
  • the evaluation value of the candidate positioning information is determined by the following formula: score = λc·∑ci + λd·∑f(di), where score is the evaluation value of the candidate positioning information; λc and λd are a priori allocated weights, which are not limited in this embodiment; ci is the prior contribution degree to vehicle positioning of the candidate map semantic feature in the i-th matching pair; di is the semantic-Euclidean distance of the i-th matching pair; and f(di) is a mapping function inversely correlated with di, that is, the smaller di is, the larger f(di) is.
  • the second determining unit 404 may select the candidate positioning information with the highest evaluation value as the optimal candidate positioning information of the vehicle.
  • the second determining unit 404 determines whether the highest evaluation value is less than the evaluation value threshold, and if it is less than the evaluation value threshold, it is determined that the map matching fails.
  • the evaluation value threshold is a priori value, that is, a predetermined value, which is not limited in this embodiment, and can be set by those skilled in the art according to actual needs.
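The evaluation and selection steps above can be sketched as below. The mapping f(d) = 1/(1 + d), the default weights, and the data shapes are illustrative assumptions; only the score formula score = λc·∑ci + λd·∑f(di) and the threshold check come from the text.

```python
def evaluate(matches, lam_c=1.0, lam_d=1.0):
    """score = lam_c * sum(c_i) + lam_d * sum(f(d_i)) for one candidate pose.
    matches: list of (c_i, d_i) tuples, one per matching pair;
    f(d) = 1/(1+d) is a hypothetical inversely correlated mapping."""
    return (lam_c * sum(c for c, _ in matches)
            + lam_d * sum(1.0 / (1.0 + d) for _, d in matches))

def pick_best(candidates, score_threshold):
    """candidates: {pose_id: [(c_i, d_i), ...]}. Returns (pose_id, matches)
    for the highest-scoring candidate pose, or None if matching failed."""
    best = max(candidates, key=lambda p: evaluate(candidates[p]))
    if evaluate(candidates[best]) < score_threshold:
        return None  # highest evaluation value below threshold: map matching fails
    return best, candidates[best]
```

A pose with more matching pairs, higher-contribution features (e.g. lane lines), and smaller distances thus scores highest, and the a priori threshold rejects poses whose best score is still too low.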
  • the division of the units in the map matching apparatus 400 is only a logical functional division, and other divisions are possible in actual implementation; for example, at least two of the acquiring unit 401, the first determining unit 402, the matching unit 403, and the second determining unit 404 may be implemented as one unit, and the acquiring unit 401, the first determining unit 402, the matching unit 403, or the second determining unit 404 may also be divided into multiple subunits.
  • each unit or sub-unit can be implemented by electronic hardware, or a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Those skilled in the art can implement the described functions using different methods for each specific application scenario.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device includes: at least one processor 501 , at least one memory 502 and at least one communication interface 503 .
  • the various components in the electronic device are coupled together by a bus system 504 .
  • the communication interface 503 is used for information transmission with external devices. Understandably, the bus system 504 is used to implement connection communication between these components.
  • the bus system 504 also includes a power bus, a control bus, and a status signal bus.
  • the various buses are labeled as bus system 504 in FIG. 5 .
  • the memory 502 in this embodiment may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory.
  • memory 502 stores the following elements, executable units or data structures, or subsets thereof, or extended sets of them: operating systems and applications.
  • the operating system includes various system programs, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic tasks and processing hardware-based tasks.
  • Applications including various applications, such as media players (Media Player), browsers (Browser), etc., are used to implement various application tasks.
  • a program for implementing the map matching method provided by the embodiments of the present application may be included in an application program.
  • the processor 501 executes the steps of the embodiments of the map matching method provided by the embodiments of the present application by calling a program or instructions stored in the memory 502, specifically a program or instructions stored in an application program.
  • the map matching method provided in this embodiment of the present application may be applied to the processor 501 or implemented by the processor 501 .
  • the processor 501 may be an integrated circuit chip with signal processing capability. In the implementation process, each step of the above-mentioned method can be completed by an integrated logic circuit of hardware in the processor 501 or an instruction in the form of software.
  • the above-mentioned processor 501 may be a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the map matching method provided by the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software units in the decoding processor.
  • the software unit may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory 502, and the processor 501 reads the information in the memory 502, and completes the steps of the method in combination with its hardware.
  • FIG. 6 is an exemplary flowchart of a map matching method provided by an embodiment of the present application.
  • the execution subject of the method is an electronic device, and in some embodiments, the execution subject of the method may also be an intelligent driving system supported by the in-vehicle device.
  • an electronic device is used as an execution subject to describe the flow of the map matching method.
  • step 601 the electronic device obtains initial positioning information of the vehicle and vehicle sensor data.
  • step 602 the electronic device determines a plurality of candidate positioning information based on the initial positioning information.
  • step 603 the electronic device determines a plurality of observation semantic features based on the vehicle sensor data.
  • step 604 the electronic device acquires local map information based on the initial positioning information, where the local map information includes multiple map semantic features.
  • step 605 the electronic device, for each of the candidate positioning information items: transforms, based on the candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system; and matches the plurality of observed semantic features with the plurality of candidate map semantic features to obtain matching pairs.
  • step 606 the electronic device determines, based on the matching pair corresponding to each candidate positioning information, the optimal candidate positioning information of the vehicle and a matching pair corresponding to the optimal candidate positioning information.
  • in some embodiments, before the matching of the plurality of observed semantic features and the plurality of candidate map semantic features, the method further includes: removing the candidate map semantic features in the blind zone of the vehicle sensor.
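The flow of steps 601 to 606 can be sketched end to end as below. Every helper callable is a hypothetical stand-in for the corresponding component described in the text (candidate sampling, feature detection, local-map retrieval, coordinate transform, matching, evaluation), injected as a parameter so the sketch stays self-contained.

```python
def map_matching(initial_pose, sensor_data, vector_map,
                 sample_candidates, detect_features, crop_local_map,
                 to_sensor_frame, match, evaluate):
    """Sketch of steps 601-606 of the map matching method."""
    candidates = sample_candidates(initial_pose)          # 602 (includes initial pose)
    observed = detect_features(sensor_data)               # 603 observed semantic features
    local_map = crop_local_map(vector_map, initial_pose)  # 604 local map information
    results = {}
    for pose in candidates:                               # 605, per candidate pose
        cand_features = to_sensor_frame(local_map, pose)  # 6051 transform to sensor frame
        results[pose] = match(observed, cand_features)    # 6052 matching pairs
    # 606: optimal candidate = highest evaluation over its matching pairs
    best = max(results, key=lambda p: evaluate(results[p]))
    return best, results[best]
```

With a 1-D toy world (poses and features as numbers, matching by proximity), the pose that aligns the map with the observations wins.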
  • the matched pair satisfies the following conditions 1 to 4:
  • Condition 1 There is a one-to-one correspondence between candidate map semantic features and their matching observed semantic features.
  • Condition 2 The semantic label of the candidate map semantic feature is the same as the semantic label of its matching observed semantic feature.
  • Condition 3 The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is less than or equal to the Euclidean distance threshold; the semantic-Euclidean distance is used to characterize the similarity between a candidate map semantic feature and an observed semantic feature; the Euclidean distance threshold is inversely correlated with the Euclidean distance between the candidate map semantic feature and the candidate positioning information.
  • Condition 4 The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is the smallest among all semantic-Euclidean distances corresponding to that candidate map semantic feature, and also the smallest among all semantic-Euclidean distances corresponding to that observed semantic feature.
  • the matching pair also satisfies the following condition 5:
  • Condition 5 If the candidate map semantic feature has an upper-level semantic feature, the semantic label of the upper-level semantic feature of the candidate map semantic feature is the same as the semantic label of the upper-level semantic feature of the matching observation semantic feature.
  • the semantic-Euclidean distance between the upper-level semantic feature of the candidate map semantic feature and the upper-level semantic feature of the observed semantic feature matching it is less than or equal to the Euclidean distance threshold.
  • the semantic-Euclidean distance between any candidate map semantic feature and any observed semantic feature is determined by: if the semantic label of the observed semantic feature differs from that of the candidate map semantic feature, the semantic-Euclidean distance is an invalid value INF; if the labels are the same and the Euclidean distance between the coordinate value corresponding to the observed semantic feature and the coordinate value corresponding to the candidate map semantic feature is less than or equal to the Euclidean distance threshold, the semantic-Euclidean distance is that Euclidean distance; otherwise, the semantic-Euclidean distance is an invalid value INF.
  • the Euclidean distance threshold is determined by: th = th0 × f(t), where th is the Euclidean distance threshold, th0 is the set fixed a priori threshold, t is the Euclidean distance between the candidate map semantic feature and the candidate positioning information, and f(t) is the mapping function inversely related to t.
  • if the semantic label of the observed semantic feature is the same as the semantic label of the candidate map semantic feature, and the observed semantic feature or the candidate map semantic feature has an upper-level semantic feature, it is determined whether the semantic label of the upper-level semantic feature of the observed semantic feature is the same as that of the upper-level semantic feature of the candidate map semantic feature; if they are different, the semantic-Euclidean distance between the observed semantic feature and the map semantic feature is determined to be the invalid value INF.
  • the matching of the plurality of observation semantic features and the plurality of candidate map semantic features to obtain a matching pair includes:
  • a distance sorting matrix is determined; each element in the distance sorting matrix is a two-tuple representing the rankings of the semantic-Euclidean distance within its row and its column;
  • the observation semantic feature and the candidate map semantic feature corresponding to the binary group (1,1) in the distance sorting matrix are determined as matching pairs.
  • the method further includes:
  • determining the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information includes:
  • the candidate positioning information with the largest number of matching pairs is selected as the optimal candidate positioning information for the vehicle.
  • determining the optimal candidate positioning information of the vehicle and the matching pair corresponding to the optimal candidate positioning information includes:
  • the candidate positioning information with the highest evaluation value is selected as the optimal candidate positioning information of the vehicle.
  • the evaluation value of the candidate positioning information is determined by the following formula: score = λc·∑ci + λd·∑f(di), where score is the evaluation value of the candidate positioning information, λc and λd are the a priori weights, ci is the prior contribution degree to vehicle positioning of the candidate map semantic feature in the i-th matching pair, di is the semantic-Euclidean distance of the i-th matching pair, and f(di) is a mapping function inversely correlated with di.
  • the method further includes:
  • An embodiment of the present application further provides a non-transitory computer-readable storage medium storing a program or instructions that cause a computer to execute the steps of the embodiments of the map matching method; to avoid repetition, the details are not described here again.
  • the embodiments of the present application are applicable to vector semantic maps and use only the vector information of the vector semantic map (for example, information of map semantic features) to match observed features with map features in real time, without relying on additional data such as descriptors and intensities, thereby achieving a good matching effect while reducing storage requirements and computing power usage.
  • the embodiments of the present application impose no special requirements on sensor type (cameras, lidars, etc. are all applicable) and have industrial applicability.

Abstract

A map matching method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring initial positioning information of a vehicle and vehicle sensor data (601); determining a plurality of candidate positioning information items based on the initial positioning information (602); determining a plurality of observed semantic features based on the vehicle sensor data (603); acquiring local map information based on the initial positioning information of the vehicle, the local map information comprising a plurality of map semantic features (604); for each candidate positioning information item (605): transforming, based on the candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system (6051), and matching the plurality of observed semantic features with the plurality of candidate map semantic features to obtain matching pairs (6052); and determining, based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to the optimal candidate positioning information (606). The method is applicable to vector semantic maps: it uses only the vector information of the vector semantic map to match observed features with map features in real time, without relying on additional data such as descriptors or intensities, thereby achieving a good matching effect while reducing storage requirements and computing power usage.

Description

Map matching method and apparatus, electronic device, and storage medium
This application claims priority to Chinese Patent Application No. 2020110314570, entitled "Map matching method and apparatus, electronic device, and storage medium", filed with the China National Intellectual Property Administration on September 27, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the technical field of intelligent driving, and in particular to a map matching method, an apparatus, an electronic device, and a non-transitory computer-readable storage medium.
Background
In the field of intelligent driving, vehicle localization plays an important role in both assisted driving and autonomous driving. Current mainstream vehicle localization techniques include vision-based vslam (visual simultaneous localization and mapping) and lidar-based lslam (laser simultaneous localization and mapping). These methods usually require a dense localization map to be built in advance and store matching information such as descriptors and intensities, which consumes considerable storage resources.
To reduce the size of the localization map and improve robustness in certain scenarios, some localization schemes are based on vector semantic maps. However, although the sparsity of a vector map greatly reduces its size, the absence of descriptors and similar information poses a challenge to achieving efficient, highly accurate real-time matching.
The above description of the discovery process of the problem is provided only to aid understanding of the technical solution of the present application and does not constitute an admission that the above content is prior art.
Summary
To solve at least one problem in the prior art, at least one embodiment of the present application provides a map matching method, an apparatus, an electronic device, and a non-transitory computer-readable storage medium.
In a first aspect, an embodiment of the present application provides a map matching method, the method comprising:
acquiring initial positioning information of a vehicle and vehicle sensor data;
determining a plurality of candidate positioning information items based on the initial positioning information;
determining a plurality of observed semantic features based on the vehicle sensor data;
acquiring local map information based on the initial positioning information, the local map information comprising a plurality of map semantic features;
for each of the candidate positioning information items:
transforming, based on the candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system;
matching the plurality of observed semantic features with the plurality of candidate map semantic features to obtain matching pairs;
determining, based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to the optimal candidate positioning information.
In a second aspect, an embodiment of the present application further provides a map matching apparatus, the apparatus comprising:
an acquiring unit, configured to acquire initial positioning information of a vehicle and vehicle sensor data, and to acquire, based on the initial positioning information, local map information comprising a plurality of map semantic features;
a first determining unit, configured to determine a plurality of candidate positioning information items based on the initial positioning information, and to determine a plurality of observed semantic features based on the vehicle sensor data;
a matching unit, configured to, for each of the candidate positioning information items:
transform, based on the candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system;
match the plurality of observed semantic features with the plurality of candidate map semantic features to obtain matching pairs;
a second determining unit, configured to determine, based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to the optimal candidate positioning information.
In a third aspect, an embodiment of the present application further provides an electronic device comprising a processor and a memory; the processor executes the steps of the map matching method of the first aspect by calling a program or instructions stored in the memory.
In a fourth aspect, an embodiment of the present application further provides a non-transitory computer-readable storage medium storing a program or instructions that cause a computer to execute the steps of the map matching method of the first aspect.
It can be seen that in at least one embodiment of the present application, a plurality of observed semantic features are determined based on vehicle sensor data; local map information (comprising a plurality of map semantic features) is acquired and a plurality of candidate positioning information items (including the initial positioning information) are determined based on the initial positioning information of the vehicle; for each candidate positioning information item, the observed semantic features are matched with the candidate map semantic features to obtain matching pairs; and based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to it are determined.
The embodiments of the present application are applicable to vector semantic maps and use only the vector information of the vector semantic map (for example, information of map semantic features) to match observed features with map features in real time, without relying on additional data such as descriptors or intensities, thereby achieving a good matching effect while reducing storage requirements and computing power usage. In addition, the embodiments of the present application impose no special requirements on sensor type (cameras, lidars, and the like are all applicable).
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them.
FIG. 1 is an exemplary scenario diagram of a matching problem provided by an embodiment of the present application;
FIG. 2 is an exemplary architecture diagram of an intelligent driving vehicle provided by an embodiment of the present application;
FIG. 3 is an exemplary block diagram of an intelligent driving system provided by an embodiment of the present application;
FIG. 4 is an exemplary block diagram of a map matching apparatus provided by an embodiment of the present application;
FIG. 5 is an exemplary block diagram of an electronic device provided by an embodiment of the present application;
FIG. 6 is an exemplary flowchart of a map matching method provided by an embodiment of the present application;
FIG. 7 is an example diagram of a semantic-Euclidean distance matrix provided by an embodiment of the present application;
FIG. 8 is an example diagram of a distance sorting matrix determined based on the semantic-Euclidean distance matrix shown in FIG. 7;
FIG. 9 is an example diagram of the matrix obtained after updating the distance sorting matrix shown in FIG. 8.
Detailed Description
To make the above objects, features, and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the described embodiments are only some, not all, of the embodiments of the present application. The specific embodiments described here are intended only to explain the present application, not to limit it. All other embodiments obtained by those of ordinary skill in the art based on the described embodiments fall within the scope of protection of the present application.
It should be noted that relational terms such as "first" and "second" herein are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations.
To perform map matching in real time during intelligent driving, the embodiments of the present application describe the matching problem as follows with reference to FIG. 1:
In FIG. 1, given a set M {m1, m2} and a set M' {m'1, m'2}, under the minimum-distance constraint, find as many matching elements in M' as possible for the elements in M to form matching pairs, each matching pair including one element from M and one element from M'.
In FIG. 1, there are two matching pairs: (m1, m'2) and (m2, m'1). It can be understood that, due to the minimum-distance constraint, the distance between m1 and m'2 is smaller than the distance between m1 and m'1; likewise, the distance between m2 and m'1 is smaller than the distance between m2 and m'2.
Applying the above description of the matching problem to map matching, the set M can be understood as the set of observed features, and the set M' as the set of map features. In some embodiments, the observed features may be real-time observed features; both observed features and map features are semantic features, that is, observed features are observed semantic features and map features are map semantic features.
An observed semantic feature can be understood as an observed semantic feature of a target used for localization, determined based on vehicle sensor data. For example, when the vehicle sensor data is image data, the category and position of targets included in the image can be determined by processing the image data with an object detection algorithm, and such a target can be understood as an observed semantic feature. For example, lane lines, traffic markings (e.g., go-straight, left-turn, right-turn), traffic lights, and traffic signs included in an image are all observed semantic features of targets used for localization.
A map semantic feature can be understood as a semantic feature of a target used for localization included in a map (for example, a vector map); for example, lane lines, traffic markings, traffic lights, and traffic signs in the map are all map semantic features of targets used for localization. In some embodiments, to facilitate obtaining map semantic features from the map, the map may be preprocessed so that it includes information of the map semantic features, such as their semantic labels and positions, so that the information of the map semantic features can be obtained from the map as soon as the map is acquired.
In map matching, the largest possible number of matching pairs is found subject to certain constraints, and the map feature corresponding to each observed feature is then determined, providing a basis for subsequent vehicle localization. It should be noted that in map matching, the constraints include not only the minimum-distance constraint but may also include other constraints; these constraints jointly determine the matching pairs, and their content is described in detail below.
Accordingly, an embodiment of the present application provides a map matching scheme: a plurality of observed semantic features are determined based on vehicle sensor data; local map information (comprising a plurality of map semantic features) is acquired and a plurality of candidate positioning information items (including the initial positioning information) are determined based on the initial positioning information of the vehicle; for each candidate positioning information item, the observed semantic features are matched with the candidate map semantic features to obtain matching pairs; and based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to it are determined.
The embodiments of the present application are applicable to vector semantic maps and use only the vector information of the vector semantic map (for example, information of map semantic features) to match observed features with map features in real time, without relying on additional data such as descriptors or intensities, thereby achieving a good matching effect while reducing storage requirements and computing power usage. In addition, the embodiments of the present application impose no special requirements on sensor type (cameras, lidars, and the like are all applicable).
In some embodiments, the semantic-Euclidean distance between map semantic features and observed semantic features can be determined, a distance matrix can then be determined, and vector semantic map matching is thereby implemented based on the distance matrix. In some embodiments, the distance-matrix-based matching method is widely applicable to any matching scenario in which a metric distance can be defined.
The embodiments of the present application can be applied to intelligent driving vehicles and to electronic devices. An intelligent driving vehicle is a vehicle equipped with an intelligent driving system of any level, for example an unmanned driving system, an assisted driving system, a driving assistance system, a highly automated driving system, or a fully autonomous vehicle. The electronic device is installed with an intelligent driving system; for example, the electronic device may be used to test intelligent driving algorithms, or may be an in-vehicle device. In some embodiments, the electronic device may also be applied to other fields. It should be understood that the application scenarios of the embodiments are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar scenarios without creative effort. For clarity, the embodiments of the present application describe the map matching method, map matching apparatus, electronic device, and non-transitory computer-readable storage medium using an intelligent driving vehicle as an example.
FIG. 2 is an exemplary architecture diagram of an intelligent driving vehicle provided by an embodiment of the present application. As shown in FIG. 2, the intelligent driving vehicle includes: a sensor group, an intelligent driving system 200, a vehicle low-level execution system, and other components that can be used to drive the vehicle and control its operation, such as the brake pedal, steering wheel, and accelerator pedal.
The sensor group is used to collect data of the vehicle's external environment and detect position data of the vehicle. The sensor group includes, for example but not limited to, at least one of a camera, lidar, millimeter-wave radar, ultrasonic radar, GPS (Global Positioning System), and IMU (Inertial Measurement Unit).
In some embodiments, the sensor group is also used to collect dynamics data of the vehicle, and further includes, for example but not limited to, at least one of a wheel speed sensor, a speed sensor, an acceleration sensor, a steering wheel angle sensor, and a front wheel angle sensor.
The intelligent driving system 200 is used to acquire sensing data of the sensor group, including but not limited to images, video, laser point clouds, millimeter waves, GPS information, and vehicle state. In some embodiments, the intelligent driving system 200 performs environment perception and vehicle localization based on the sensing data to generate perception information and a vehicle pose; performs planning and decision-making based on the perception information and vehicle pose to generate planning and decision information; and generates vehicle control instructions based on the planning and decision information and issues them to the vehicle low-level execution system.
In some embodiments, the intelligent driving system 200 may be a software system, a hardware system, or a combination of both. For example, the intelligent driving system 200 is a software system running on an operating system, and the in-vehicle hardware system is the hardware system supporting the operation of the operating system.
In some embodiments, the intelligent driving system 200 can interact with a cloud server. In some embodiments, the intelligent driving system 200 interacts with the cloud server via a wireless communication network (for example, including but not limited to a GPRS, Zigbee, Wifi, 3G, 4G, or 5G network).
In some embodiments, the cloud server is used to interact with the vehicle. The cloud server can send the vehicle environment information, positioning information, control information, and other information needed during intelligent driving. In some embodiments, the cloud server can receive sensing data, vehicle state information, vehicle driving information, and information related to vehicle requests from the vehicle side. In some embodiments, the cloud server can remotely control the vehicle based on user settings or vehicle requests. In some embodiments, the cloud server may be a single server or a server group; the server group may be centralized or distributed. In some embodiments, the cloud server may be local or remote.
The vehicle low-level execution system is used to receive vehicle control instructions and control the vehicle's driving based on them. In some embodiments, the vehicle low-level execution system includes but is not limited to a steering system, a braking system, and a drive system. In some embodiments, the vehicle low-level execution system may further include a low-level controller for parsing the vehicle control instructions and issuing them to the corresponding systems such as the steering, braking, and drive systems.
In some embodiments, the intelligent driving vehicle may further include a vehicle CAN bus, not shown in FIG. 2, connected to the vehicle low-level execution system. Information exchange between the intelligent driving system 200 and the vehicle low-level execution system is transmitted via the vehicle CAN bus.
FIG. 3 is an exemplary block diagram of an intelligent driving system 300 provided by an embodiment of the present application. In some embodiments, the intelligent driving system 300 may be implemented as the intelligent driving system 200 in FIG. 2 or a part thereof, for controlling the driving of the vehicle.
As shown in FIG. 3, the intelligent driving system 300 may be divided into a plurality of modules, for example: a perception module 301, a planning module 302, a control module 303, a map matching module 304, and other modules that can be used for intelligent driving.
The perception module 301 is used for environment perception and localization. In some embodiments, the perception module 301 acquires sensor data, V2X (Vehicle to X) data, high-precision maps, and other data, and performs environment perception and localization based on at least one of these to generate perception information and positioning information. The perception information may include but is not limited to at least one of: obstacle information, road signs/markings, pedestrian/vehicle information, and drivable area. The positioning information includes the vehicle pose.
The planning module 302 is used for path planning and decision-making. In some embodiments, the planning module 302 generates planning and decision information based on the perception information and positioning information generated by the perception module 301. In some embodiments, the planning module 302 may also combine at least one of V2X data, high-precision maps, and other data to generate planning and decision information. The planning information may include but is not limited to a planned path; the decision information may include but is not limited to at least one of: behavior (for example, including but not limited to following, overtaking, stopping, and detouring), vehicle heading, vehicle speed, desired vehicle acceleration, and desired steering wheel angle.
The control module 303 is used to generate control instructions for the vehicle low-level execution system based on the planning and decision information and issue them, so that the vehicle low-level execution system controls the vehicle's driving. The control instructions may include but are not limited to: steering wheel steering, lateral control instructions, and longitudinal control instructions.
The map matching module 304 is used to determine a plurality of observed semantic features based on vehicle sensor data; acquire local map information (comprising a plurality of map semantic features) and determine a plurality of candidate positioning information items (including the initial positioning information) based on the initial positioning information of the vehicle; match, for each candidate positioning information item, the observed semantic features with the candidate map semantic features to obtain matching pairs; and determine, based on the matching pairs corresponding to each candidate positioning information item, the optimal candidate positioning information of the vehicle and the matching pairs corresponding to it.
In some embodiments, the functions of the map matching module 304 may be integrated into the perception module 301, the planning module 302, or the control module 303, or it may be configured as a module independent of the intelligent driving system 300. The map matching module 304 may be a software module, a hardware module, or a combination of both. For example, the map matching module 304 is a software module running on an operating system, and the in-vehicle hardware system is the hardware system supporting the operation of the operating system.
FIG. 4 is an exemplary block diagram of a map matching apparatus 400 provided by an embodiment of the present application. In some embodiments, the map matching apparatus 400 may be implemented as the map matching module 304 in FIG. 3 or a part thereof.
As shown in FIG. 4, the map matching apparatus 400 may include but is not limited to the following units: an acquiring unit 401, a first determining unit 402, a matching unit 403, and a second determining unit 404.
Acquiring unit 401
The acquiring unit 401 is used to acquire information related to the vehicle, for example information directly or indirectly associated with the vehicle such as initial positioning information and vehicle sensor data. The initial positioning information may come from prior information external to the vehicle or from an estimate by the second determining unit 404; the vehicle sensor data may be acquired through data interaction with sensors installed on the vehicle.
In some embodiments, the acquiring unit 401 may acquire local map information based on the initial positioning information. In some embodiments, the acquiring unit 401 may acquire local map information from a pre-built vector semantic map based on the initial positioning information, the local map information being a part of the vector semantic map. In some embodiments, the acquiring unit 401 may index the vector semantic map using algorithms such as fast nearest neighbor search to acquire the local map information based on the initial positioning information.
The pre-built vector semantic map includes information of map semantic features; for example, lane lines, traffic markings, traffic lights, and traffic signs in the map are all map semantic features. The information of a map semantic feature includes, for example, information related to the map semantic feature such as its semantic label and position.
In this embodiment, as soon as the acquiring unit 401 acquires the local map information, it can obtain the information of the map semantic features from it, that is, the plurality of map semantic features included in the local map information.
First determining unit 402
The first determining unit 402 is used to determine a plurality of candidate positioning information items based on the initial positioning information, the plurality of candidate positioning information items including the initial positioning information. In this embodiment, considering that the initial positioning information generally has a large error while map semantic features are rather sparse, directly using the initial positioning information for map matching would lead to low matching accuracy or few valid matching features; therefore, determining a plurality of candidate positioning information items can improve the accuracy of subsequent map matching or increase the number of valid matching features.
In some embodiments, the first determining unit 402 may perform discrete random sampling in a space within a certain range of the initial positioning information to obtain a plurality of candidate positioning information items. In some embodiments, the first determining unit 402 generates, through Monte Carlo random sampling, n candidate positioning information items (including the initial positioning information) according to a certain probability distribution within a spatial range r around the initial positioning information. Both the spatial range r and the number n of candidate positioning information items are related to the uncertainty of the initial positioning information: the higher the uncertainty, the larger the values of r and n generally are.
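A minimal sketch of the Monte Carlo sampling described above. The Gaussian distribution and the (x, y, yaw) pose representation are illustrative assumptions: the text only requires sampling from some probability distribution within a range r that grows with the uncertainty of the initial positioning information.

```python
import random

def sample_candidates(x, y, yaw, sigma_xy, sigma_yaw, n):
    """Monte Carlo sampling of n candidate poses around the initial pose.
    sigma_xy / sigma_yaw play the role of the range r: they should grow
    with the uncertainty of the initial positioning information, as should n."""
    candidates = [(x, y, yaw)]  # the initial pose itself is kept as a candidate
    for _ in range(n - 1):
        candidates.append((random.gauss(x, sigma_xy),
                           random.gauss(y, sigma_xy),
                           random.gauss(yaw, sigma_yaw)))
    return candidates
```

Each sampled pose is then matched independently, and the best-scoring one becomes the optimal candidate positioning information.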
In some embodiments, the first determining unit 402 may determine a plurality of observed semantic features based on the vehicle sensor data; these are real-time observed semantic features. For example, when the vehicle sensor data is image data, the first determining unit 402 processes the image data with an object detection algorithm to determine the category and position of the targets included in the image, and such a target can be understood as an observed semantic feature. For example, lane lines, traffic markings (e.g., go-straight, left-turn, right-turn), traffic lights, and traffic signs included in an image are all observed semantic features. It should be noted that the above sensor data example is for illustration only and does not limit the present application; in practical applications, the vehicle sensor data may be of any form (for example, lidar data), as long as observed semantic features can be identified from the sensor data.
Matching unit 403
The matching unit 403 is used to perform map matching for each candidate positioning information item. In some embodiments, for each candidate positioning information item, the matching unit 403 transforms, based on that candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system, and then matches the plurality of observed semantic features with the plurality of candidate map semantic features to obtain matching pairs. In some embodiments, the matching unit 403 also transforms the plurality of observed semantic features into the coordinate system of the same vehicle sensor, and then matches the observed semantic features so transformed with the plurality of candidate map semantic features to obtain matching pairs.
In some embodiments, the matching unit 403 performs observability screening on each candidate map semantic feature, that is, judges whether the candidate map semantic feature is within the blind zone of the vehicle sensor; if so, it determines that the candidate map semantic feature cannot be matched by any observed semantic feature and should be filtered out and excluded from subsequent matching.
Specifically, for each candidate positioning information item, the matching unit 403 transforms, based on that candidate positioning information, the plurality of map semantic features into the coordinate system of the vehicle sensor to obtain a plurality of candidate map semantic features in that coordinate system, removes the candidate map semantic features within the blind zone of the vehicle sensor, and then matches the observed semantic features with the remaining candidate map semantic features to obtain matching pairs.
In some embodiments, to improve the accuracy of map matching or increase the number of valid matching features, this embodiment specifies that a matching pair satisfies the following Conditions 1 to 4:
Condition 1: Candidate map semantic features and their matching observed semantic features correspond one to one. That is, one candidate map semantic feature can match only one observed semantic feature, and one observed semantic feature can match only one candidate map semantic feature.
Condition 2: The semantic label of the candidate map semantic feature is the same as that of its matching observed semantic feature.
Condition 3: The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is less than or equal to the Euclidean distance threshold. The semantic-Euclidean distance characterizes the similarity between a candidate map semantic feature and an observed semantic feature; the Euclidean distance threshold is inversely correlated with the Euclidean distance between the candidate map semantic feature and the candidate positioning information. The Euclidean distance between the candidate map semantic feature and the candidate positioning information can be understood as the Euclidean distance between the position (coordinate value) corresponding to the candidate map semantic feature and the candidate positioning information (coordinate value).
In some embodiments, the Euclidean distance threshold is determined by:
th = th0 × f(t)
where th is the Euclidean distance threshold, th0 is a set fixed a priori threshold, t is the Euclidean distance between the candidate map semantic feature and the candidate positioning information, and f(t) is a mapping function inversely correlated with t. The larger the Euclidean distance t between the candidate map semantic feature and the candidate positioning information (that is, the farther the candidate map semantic feature is from the candidate positioning information), the more likely a matching error is; therefore, the Euclidean distance threshold th is made smaller to increase the probability of correct matching.
Condition 4: The semantic-Euclidean distance between the candidate map semantic feature and its matching observed semantic feature is the smallest among all semantic-Euclidean distances corresponding to that candidate map semantic feature, and also the smallest among all semantic-Euclidean distances corresponding to that observed semantic feature. In some embodiments, a semantic-Euclidean distance is calculated between the candidate map semantic feature and each observed semantic feature. In some embodiments, if a certain semantic-Euclidean distance is the smallest among the multiple semantic-Euclidean distances corresponding to the candidate map semantic feature but not the smallest among the multiple semantic-Euclidean distances corresponding to the observed semantic feature, the two do not form a matching pair.
In some embodiments, a matching pair also needs to satisfy the following Condition 5:
Condition 5: If the candidate map semantic feature has an upper-level semantic feature, the semantic label of the upper-level semantic feature of the candidate map semantic feature is the same as that of the upper-level semantic feature of its matching observed semantic feature. In some embodiments, the semantic-Euclidean distance between the upper-level semantic feature of the candidate map semantic feature and the upper-level semantic feature of its matching observed semantic feature is less than or equal to the Euclidean distance threshold.
In some embodiments, an upper-level semantic feature represents the overall information of a target used for localization, while a lower-level semantic feature represents a part or endpoint of the target. For example, the upper-level semantic feature is a lane line, and the lower-level semantic feature is an endpoint of the lane line.
The purpose of Condition 5 is to reduce the chance of matching errors, for example, to reduce the chance of situations such as the endpoint of one lane line being matched to the endpoint of another lane line.
It should be noted that some current matching algorithms, such as nearest-neighbor matching or brute-force matching, cannot satisfy Conditions 1 to 5 above.
In some embodiments, the matching unit 403 determines the semantic-Euclidean distance between any candidate map semantic feature and any observed semantic feature in the following manner:
If the semantic label of the observed semantic feature differs from that of the candidate map semantic feature, the semantic-Euclidean distance is the invalid value INF.
If the semantic label of the observed semantic feature is the same as that of the candidate map semantic feature:
determine the Euclidean distance between the coordinate value corresponding to the observed semantic feature and the coordinate value corresponding to the candidate map semantic feature, as well as the Euclidean distance threshold;
if that Euclidean distance is less than or equal to the Euclidean distance threshold, the semantic-Euclidean distance is that Euclidean distance;
if that Euclidean distance is greater than the Euclidean distance threshold, the semantic-Euclidean distance is the invalid value INF.
In some embodiments, the semantic-Euclidean distance is determined by:
d = de,  if label = label' and de ≤ th
d = INF, otherwise
where d is the semantic-Euclidean distance; INF is an infinitely large value (that is, the invalid value); th is the Euclidean distance threshold; de = f(m, m'), where f(m, m') is the Euclidean distance calculation function, whose specific form depends on the geometry of m and m' (for example, point-to-point and line-to-line distance metrics differ, but each yields a Euclidean distance value); m is the observed semantic feature, m' is the candidate map semantic feature, label is the semantic label of the observed semantic feature, and label' is the semantic label of the candidate map semantic feature.
In some embodiments, to satisfy Condition 5, if the semantic label of the observed semantic feature is the same as that of the candidate map semantic feature and the observed semantic feature or the candidate map semantic feature has an upper-level semantic feature, it is judged whether the semantic label of the upper-level semantic feature of the observed semantic feature is the same as that of the upper-level semantic feature of the candidate map semantic feature; if not, the semantic-Euclidean distance between the observed semantic feature and the map semantic feature is determined to be the invalid value INF.
In some embodiments, the matching unit 403 may determine a semantic-Euclidean distance matrix formed by the plurality of observed semantic features and the plurality of candidate map semantic features. In the semantic-Euclidean distance matrix shown in FIG. 7, m1, m2, m3, m4, and m5 are observed semantic features, and m'1, m'2, m'3, m'4, and m'5 are candidate map semantic features. In FIG. 7, the semantic-Euclidean distance between m1 and m'1 is 0.5, between m1 and m'2 is INF, and between m1 and m'3 is 0.1; it can be seen that each element in the semantic-Euclidean distance matrix shown in FIG. 7 represents a semantic-Euclidean distance. In some embodiments, since invalid values INF exist in the semantic-Euclidean distance matrix shown in FIG. 7, the matrix is sparse; the invalid values INF need not be stored, only the valid values, which improves the efficiency of subsequent matching.
In some embodiments, the matching unit 403 may determine a distance sorting matrix based on the semantic-Euclidean distance matrix. Each element in the distance sorting matrix is a two-tuple representing the rankings of the semantic-Euclidean distance within its row and its column; the smaller the distance, the smaller the ranking value, and a ranking value of 1 indicates the closest distance. The distance sorting matrix shown in FIG. 8 is determined based on the semantic-Euclidean distance matrix shown in FIG. 7. For example, since the rankings of the semantic-Euclidean distance between m1 and m'1 in FIG. 7 within its row and column are 1 and 3, respectively, the value of the element corresponding to m1 and m'1 in FIG. 8 is (1, 3).
In some embodiments, the matching unit 403 may determine the observed semantic feature and candidate map semantic feature corresponding to a two-tuple (1,1) in the distance sorting matrix as a matching pair. For example, in FIG. 8, m1 and m'3 are determined as a matching pair, and m5 and m'5 are also determined as a matching pair.
In some embodiments, after determining the observed semantic feature and candidate map semantic feature corresponding to a two-tuple (1,1) in the distance sorting matrix as a matching pair, the matching unit 403 modifies all elements of the row and column corresponding to that two-tuple (1,1) to the invalid value INF and updates the distance sorting matrix.
FIG. 9 is an example diagram of the matrix obtained after updating the distance sorting matrix shown in FIG. 8. To satisfy Condition 1, all elements of the rows and columns corresponding to the two-tuples (1,1) in FIG. 8 are modified to the invalid value INF; the elements modified to INF cause the values of the elements in the same rows and columns to change, forming the new matrix shown in FIG. 9. It should be noted that modification to the invalid value INF is merely an intermediate operation for updating the distance sorting matrix and does not mean that the semantic-Euclidean distance is the invalid value INF.
In some embodiments, the matching unit 403 may determine the observed semantic feature and map semantic feature corresponding to a two-tuple (1,1) in the updated distance sorting matrix as a matching pair. For example, in FIG. 9, m3 and m'2, and m4 and m'4, are all determined as matching pairs.
The matching unit 403 may repeatedly update the distance sorting matrix and determine matching pairs until there is no two-tuple (1,1) in the updated distance sorting matrix.
在一些实施例中,在任意匹配场景中,只要定义了度量距离的方式,就可以构建距离矩阵,进而进行匹配获得匹配结果,因此本申请实施例提出的基于距离矩阵的匹配方式具有广泛的应用性。
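The rank-matrix procedure described above (compute the distance matrix, rank each valid entry within its row and column, accept every (1, 1) pair, invalidate its row and column, and repeat) might be sketched as follows; the function name and the greedy sequential sweep are illustrative assumptions, not the application's actual implementation:

```python
INF = float("inf")  # invalid value

def match_pairs(dist):
    """Greedy mutual-nearest matching on a semantic-Euclidean distance
    matrix dist (rows: observed features, columns: candidate map features).
    Repeatedly accepts entries ranking first in both their row and their
    column, invalidates the paired row and column, and stops when no
    (1, 1) tuple remains."""
    d = [row[:] for row in dist]  # work on a copy
    pairs = []
    while True:
        found = False
        for i in range(len(d)):
            for j in range(len(d[i])):
                v = d[i][j]
                if v == INF:
                    continue
                # rank of this entry within its row and within its column
                row_rank = 1 + sum(1 for w in d[i] if w != INF and w < v)
                col_rank = 1 + sum(1 for k in range(len(d))
                                   if d[k][j] != INF and d[k][j] < v)
                if row_rank == 1 and col_rank == 1:
                    pairs.append((i, j))
                    # invalidate the matched row and column (Condition 1)
                    d[i] = [INF] * len(d[i])
                    for k in range(len(d)):
                        d[k][j] = INF
                    found = True
        if not found:
            return pairs

dist = [
    [0.5, INF, 0.1],
    [INF, 0.2, 0.4],
    [0.3, INF, INF],
]
print(match_pairs(dist))  # [(0, 2), (1, 1), (2, 0)]
```

Each accepted pair is mutually nearest among the rows and columns still valid, mirroring Conditions 1 and 4 above.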
Second determination unit 404
The second determination unit 404 is configured to determine, based on the match pairs corresponding to each piece of candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information. In some embodiments, the optimal candidate positioning information may serve as the initial positioning information for map matching of observed semantic features acquired in real time.
In some embodiments, the second determination unit 404 selects the candidate positioning information with the largest number of match pairs as the optimal candidate positioning information of the vehicle.
In some embodiments, the second determination unit 404 may determine an evaluation value for each piece of candidate positioning information based on the prior contribution of different candidate map semantic features to vehicle positioning and the match pairs corresponding to each piece of candidate positioning information. The prior contribution may be configured in advance; for example, ground semantic features such as lane lines contribute more to vehicle positioning than non-ground features such as road signs, so the prior contribution of lane lines may be set higher than that of road signs.
In some embodiments, the evaluation value of candidate positioning information is determined by the following formula:
score = λ_c × Σc_i + λ_d × Σf(d_i)
where score is the evaluation value of the candidate positioning information; λ_c and λ_d are prior weights, whose specific values are not limited in this embodiment; c_i is the prior contribution to vehicle positioning of the candidate map semantic feature in the i-th match pair; d_i is the semantic-Euclidean distance of the i-th match pair; and f(d_i) is a mapping function inversely related to d_i, i.e., the smaller d_i, the larger f(d_i).
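Under stated assumptions (unit weights λ_c = λ_d = 1 and f(d) = 1/(1 + d) as one possible inverse mapping; the application fixes neither), the evaluation value could be computed as:

```python
def score(matches, lam_c=1.0, lam_d=1.0):
    """Evaluation value score = λ_c·Σc_i + λ_d·Σf(d_i) of one candidate pose.
    matches is a list of (c_i, d_i) pairs: the prior contribution of the
    matched map feature and the semantic-Euclidean distance of the pair."""
    f = lambda d: 1.0 / (1.0 + d)  # any mapping inversely related to d works
    return (lam_c * sum(c for c, _ in matches)
            + lam_d * sum(f(d) for _, d in matches))

# two match pairs: (prior contribution, semantic-Euclidean distance)
print(score([(2.0, 0.0), (1.0, 1.0)]))  # 3.0 + 1.0 + 0.5 = 4.5
```

A larger score thus rewards both more informative map features (larger Σc_i) and tighter geometric agreement (smaller d_i, larger f(d_i)).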
In some embodiments, the second determination unit 404 may select the candidate positioning information with the highest evaluation value as the optimal candidate positioning information of the vehicle.
In some embodiments, after selecting the candidate positioning information with the highest evaluation value, the second determination unit 404 determines whether the highest evaluation value is less than an evaluation value threshold; if so, map matching is determined to have failed. The evaluation value threshold is a prior value, i.e., a predetermined value; this embodiment does not limit its specific value, and those skilled in the art may set it according to actual needs.
In some embodiments, the division of units in the map matching apparatus 400 is merely a logical functional division; in actual implementation there may be other divisions. For example, at least two of the acquisition unit 401, the first determination unit 402, the matching unit 403 and the second determination unit 404 may be implemented as one unit; conversely, the acquisition unit 401, the first determination unit 402, the matching unit 403 or the second determination unit 404 may be divided into multiple sub-units. It can be understood that each unit or sub-unit can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application scenario.
FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
As shown in FIG. 5, the electronic device includes at least one processor 501, at least one memory 502 and at least one communication interface 503. The components of the electronic device are coupled together through a bus system 504. The communication interface 503 is used for information transmission with external devices. Understandably, the bus system 504 is used to implement connection and communication between these components. In addition to a data bus, the bus system 504 includes a power bus, a control bus and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 504 in FIG. 5.
It can be understood that the memory 502 in this embodiment may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
In some implementations, the memory 502 stores the following elements, executable units or data structures, or subsets or extended sets thereof: an operating system and application programs.
The operating system contains various system programs, such as a framework layer, a core library layer and a driver layer, for implementing various basic tasks and processing hardware-based tasks. The application programs contain various applications, such as a media player and a browser, for implementing various application tasks. A program implementing the map matching method provided by the embodiments of this application may be contained in the application programs.
In the embodiments of this application, the processor 501 invokes programs or instructions stored in the memory 502, specifically programs or instructions stored in the application programs, to execute the steps of the embodiments of the map matching method provided by the embodiments of this application.
The map matching method provided by the embodiments of this application may be applied in, or implemented by, the processor 501. The processor 501 may be an integrated circuit chip with signal processing capability. During implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 501 or by instructions in the form of software. The processor 501 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The steps of the map matching method provided by the embodiments of this application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software units in a decoding processor. The software unit may be located in a storage medium mature in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 502; the processor 501 reads the information in the memory 502 and completes the steps of the method in combination with its hardware.
FIG. 6 is an exemplary flowchart of a map matching method provided by an embodiment of this application. The method is executed by an electronic device; in some embodiments, the method may also be executed by an intelligent driving system supported by an on-board device. For ease of description, the following embodiments describe the flow of the map matching method with an electronic device as the executing entity.
As shown in FIG. 6, in step 601, the electronic device acquires initial positioning information of a vehicle and vehicle sensor data.
In step 602, the electronic device determines multiple pieces of candidate positioning information based on the initial positioning information.
In step 603, the electronic device determines multiple observed semantic features based on the vehicle sensor data.
In step 604, the electronic device acquires local map information based on the initial positioning information, the local map information including multiple map semantic features.
In step 605, for each piece of the candidate positioning information, the electronic device:
6051. transforms, based on the candidate positioning information, the multiple map semantic features into the coordinate system of the vehicle sensor, obtaining multiple candidate map semantic features in that coordinate system;
6052. matches the multiple observed semantic features with the multiple candidate map semantic features to obtain match pairs.
In step 606, the electronic device determines, based on the match pairs corresponding to each piece of the candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information.
In some embodiments, before matching the multiple observed semantic features with the multiple candidate map semantic features, the method further includes:
removing candidate map semantic features within the blind zone of the vehicle sensor.
In some embodiments, the match pairs satisfy the following Conditions 1 to 4:
Condition 1: candidate map semantic features and their matched observed semantic features correspond one to one.
Condition 2: the semantic label of a candidate map semantic feature is the same as the semantic label of its matched observed semantic feature.
Condition 3: the semantic-Euclidean distance between a candidate map semantic feature and its matched observed semantic feature is less than or equal to a Euclidean distance threshold; the semantic-Euclidean distance characterizes the closeness of a candidate map semantic feature and an observed semantic feature; the Euclidean distance threshold is inversely related to the Euclidean distance between the candidate map semantic feature and the candidate positioning information.
Condition 4: the semantic-Euclidean distance between a candidate map semantic feature and its matched observed semantic feature is the smallest among all semantic-Euclidean distances of that candidate map semantic feature, and the smallest among all semantic-Euclidean distances of that observed semantic feature.
In some embodiments, the match pairs further satisfy the following Condition 5:
Condition 5: if a candidate map semantic feature has a parent semantic feature, the semantic label of the parent semantic feature of the candidate map semantic feature is the same as the semantic label of the parent semantic feature of its matched observed semantic feature.
In some embodiments, the semantic-Euclidean distance between the parent semantic feature of the candidate map semantic feature and the parent semantic feature of its matched observed semantic feature is less than or equal to the Euclidean distance threshold.
In some embodiments, the semantic-Euclidean distance between any candidate map semantic feature and any observed semantic feature is determined as follows:
if the semantic label of the observed semantic feature is different from the semantic label of the candidate map semantic feature, the semantic-Euclidean distance is the invalid value INF;
if the semantic label of the observed semantic feature is the same as the semantic label of the candidate map semantic feature:
determine the Euclidean distance between the coordinates of the observed semantic feature and the coordinates of the candidate map semantic feature, and the Euclidean distance threshold;
if that Euclidean distance is less than or equal to the Euclidean distance threshold, the semantic-Euclidean distance is that Euclidean distance;
if that Euclidean distance is greater than the Euclidean distance threshold, the semantic-Euclidean distance is the invalid value INF.
In some embodiments, the Euclidean distance threshold is determined by the following formula:
th = th_0 × f(t)
where th is the Euclidean distance threshold, th_0 is a fixed prior threshold, t is the Euclidean distance between the candidate map semantic feature and the candidate positioning information, and f(t) is a mapping function inversely related to t.
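For illustration only, taking f(t) = 1/(1 + t) as one mapping inversely related to t (the application does not prescribe a specific f, and the function name here is hypothetical), the adaptive threshold could look like:

```python
def distance_threshold(t, th0=1.0):
    """Adaptive Euclidean distance threshold th = th0 * f(t): the farther a
    candidate map feature lies from the candidate pose (larger t), the
    tighter the threshold. f must be inversely related to t; 1/(1 + t) is
    one possible choice."""
    return th0 * (1.0 / (1.0 + t))

print(distance_threshold(0.0))  # 1.0 (full threshold for a nearby feature)
print(distance_threshold(4.0))  # 0.2 (tighter threshold for a distant feature)
```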
In some embodiments, if the semantic label of an observed semantic feature is the same as that of a candidate map semantic feature, and the observed semantic feature or the candidate map semantic feature has a parent semantic feature, it is determined whether the semantic label of the parent semantic feature of the observed semantic feature is the same as that of the parent semantic feature of the candidate map semantic feature; if not, the semantic-Euclidean distance between the observed semantic feature and the map semantic feature is determined to be the invalid value INF.
In some embodiments, matching the multiple observed semantic features with the multiple candidate map semantic features to obtain match pairs includes:
determining a semantic-Euclidean distance matrix formed by the multiple observed semantic features and the multiple candidate map semantic features;
determining a distance rank matrix based on the semantic-Euclidean distance matrix, each element of the distance rank matrix being a 2-tuple giving the rank of the semantic-Euclidean distance within its row and within its column;
determining the observed semantic feature and candidate map semantic feature corresponding to each (1, 1) tuple in the distance rank matrix as a match pair.
In some embodiments, after determining the observed semantic feature and candidate map semantic feature corresponding to a (1, 1) tuple in the distance rank matrix as a match pair, the method further includes:
modifying all elements in the row and column of the (1, 1) tuple to the invalid value INF, and updating the distance rank matrix;
determining the observed semantic feature and candidate map semantic feature corresponding to each (1, 1) tuple in the updated distance rank matrix as a match pair;
repeating the above two steps until the updated distance rank matrix contains no (1, 1) tuple.
In some embodiments, determining, based on the match pairs corresponding to each piece of the candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information includes:
selecting the candidate positioning information with the largest number of match pairs as the optimal candidate positioning information of the vehicle.
In some embodiments, determining, based on the match pairs corresponding to each piece of the candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information includes:
determining an evaluation value for each piece of the candidate positioning information based on the prior contribution of different candidate map semantic features to vehicle positioning and the match pairs corresponding to each piece of the candidate positioning information;
selecting the candidate positioning information with the highest evaluation value as the optimal candidate positioning information of the vehicle.
In some embodiments, the evaluation value of candidate positioning information is determined by the following formula:
score = λ_c × Σc_i + λ_d × Σf(d_i)
where score is the evaluation value of the candidate positioning information, λ_c and λ_d are prior weights, c_i is the prior contribution to vehicle positioning of the candidate map semantic feature in the i-th match pair, d_i is the semantic-Euclidean distance of the i-th match pair, and f(d_i) is a mapping function inversely related to d_i.
In some embodiments, after selecting the candidate positioning information with the highest evaluation value, the method further includes:
determining whether the highest evaluation value is less than an evaluation value threshold, and if so, determining that map matching has failed.
It should be noted that, for brevity of description, the foregoing method embodiments are all expressed as a series of action combinations. Those skilled in the art will understand, however, that the embodiments of this application are not limited by the described order of actions, since according to the embodiments of this application some steps may be performed in other orders or simultaneously. In addition, those skilled in the art will understand that the embodiments described in this specification are all optional embodiments.
An embodiment of this application further provides a non-transitory computer-readable storage medium storing programs or instructions that cause a computer to execute the steps of the embodiments of the map matching method; to avoid repetition, they are not described again here.
It should be noted that, herein, the terms "comprise", "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or apparatus including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus that includes the element.
Those skilled in the art will understand that, although some embodiments described herein include certain features that are included in other embodiments rather than others, combinations of features of different embodiments are meant to be within the scope of this application and form different embodiments.
Those skilled in the art will understand that the description of each embodiment has its own emphasis; for parts not detailed in one embodiment, reference may be made to the relevant descriptions of other embodiments.
Although the implementations of this application have been described with reference to the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of this application, and such modifications and variations all fall within the scope defined by the appended claims.
Industrial Applicability
The embodiments of this application are applicable to vector semantic maps. Using only the vector information of a vector semantic map (for example, information of map semantic features), they achieve real-time matching of observed features with map features without relying on additional data such as descriptors or intensities, thereby achieving a good matching effect while reducing storage requirements and computing power usage. In addition, the embodiments of this application impose no special requirements on sensor type (cameras, lidars, etc. are all applicable). This application has industrial applicability.

Claims (17)

  1. A map matching method, comprising:
    acquiring initial positioning information of a vehicle and vehicle sensor data;
    determining multiple pieces of candidate positioning information based on the initial positioning information;
    determining multiple observed semantic features based on the vehicle sensor data;
    acquiring local map information based on the initial positioning information, the local map information comprising multiple map semantic features;
    for each piece of the candidate positioning information:
    transforming, based on the candidate positioning information, the multiple map semantic features into a coordinate system of the vehicle sensor to obtain multiple candidate map semantic features in that coordinate system;
    matching the multiple observed semantic features with the multiple candidate map semantic features to obtain match pairs;
    determining, based on the match pairs corresponding to each piece of the candidate positioning information, optimal candidate positioning information of the vehicle and match pairs corresponding to the optimal candidate positioning information.
  2. The method according to claim 1, wherein before matching the multiple observed semantic features with the multiple candidate map semantic features, the method further comprises: removing candidate map semantic features within a blind zone of the vehicle sensor.
  3. The method according to claim 1, wherein the match pairs satisfy the following conditions:
    candidate map semantic features and their matched observed semantic features correspond one to one;
    the semantic label of a candidate map semantic feature is the same as the semantic label of its matched observed semantic feature;
    the semantic-Euclidean distance between a candidate map semantic feature and its matched observed semantic feature is less than or equal to a Euclidean distance threshold, the semantic-Euclidean distance characterizing the closeness of a candidate map semantic feature and an observed semantic feature, and the Euclidean distance threshold being inversely related to the Euclidean distance between the candidate map semantic feature and the candidate positioning information;
    the semantic-Euclidean distance between a candidate map semantic feature and its matched observed semantic feature is the smallest among all semantic-Euclidean distances of the candidate map semantic feature, and the smallest among all semantic-Euclidean distances of the observed semantic feature.
  4. The method according to claim 3, wherein the match pairs further satisfy the following condition:
    if a candidate map semantic feature has a parent semantic feature, the semantic label of the parent semantic feature of the candidate map semantic feature is the same as the semantic label of the parent semantic feature of its matched observed semantic feature.
  5. The method according to claim 4, wherein the semantic-Euclidean distance between the parent semantic feature of the candidate map semantic feature and the parent semantic feature of its matched observed semantic feature is less than or equal to the Euclidean distance threshold.
  6. The method according to claim 3, wherein the semantic-Euclidean distance between any candidate map semantic feature and any observed semantic feature is determined as follows:
    if the semantic label of the observed semantic feature is different from the semantic label of the candidate map semantic feature, the semantic-Euclidean distance is the invalid value INF;
    if the semantic label of the observed semantic feature is the same as the semantic label of the candidate map semantic feature:
    determining the Euclidean distance between the coordinates of the observed semantic feature and the coordinates of the candidate map semantic feature, and the Euclidean distance threshold;
    if the Euclidean distance between the coordinates of the observed semantic feature and the coordinates of the candidate map semantic feature is less than or equal to the Euclidean distance threshold, the semantic-Euclidean distance is that Euclidean distance;
    if the Euclidean distance between the coordinates of the observed semantic feature and the coordinates of the candidate map semantic feature is greater than the Euclidean distance threshold, the semantic-Euclidean distance is the invalid value INF.
  7. The method according to claim 3, wherein the Euclidean distance threshold is determined by the following formula:
    th = th_0 × f(t)
    where th is the Euclidean distance threshold, th_0 is a fixed prior threshold, t is the Euclidean distance between the candidate map semantic feature and the candidate positioning information, and f(t) is a mapping function inversely related to t.
  8. The method according to claim 6, wherein if the semantic label of an observed semantic feature is the same as that of a candidate map semantic feature, and the observed semantic feature or the candidate map semantic feature has a parent semantic feature, it is determined whether the semantic label of the parent semantic feature of the observed semantic feature is the same as that of the parent semantic feature of the candidate map semantic feature; if not, the semantic-Euclidean distance between the observed semantic feature and the map semantic feature is determined to be the invalid value INF.
  9. The method according to claim 3, wherein matching the multiple observed semantic features with the multiple candidate map semantic features to obtain match pairs comprises:
    determining a semantic-Euclidean distance matrix formed by the multiple observed semantic features and the multiple candidate map semantic features;
    determining a distance rank matrix based on the semantic-Euclidean distance matrix, each element of the distance rank matrix being a 2-tuple giving the rank of the semantic-Euclidean distance within its row and within its column;
    determining the observed semantic feature and candidate map semantic feature corresponding to each (1, 1) tuple in the distance rank matrix as a match pair.
  10. The method according to claim 9, wherein after determining the observed semantic feature and candidate map semantic feature corresponding to a (1, 1) tuple in the distance rank matrix as a match pair, the method further comprises:
    modifying all elements in the row and column of the (1, 1) tuple to the invalid value INF, and updating the distance rank matrix;
    determining the observed semantic feature and candidate map semantic feature corresponding to each (1, 1) tuple in the updated distance rank matrix as a match pair;
    repeating the above two steps until the updated distance rank matrix contains no (1, 1) tuple.
  11. The method according to claim 1, wherein determining, based on the match pairs corresponding to each piece of the candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information comprises:
    selecting the candidate positioning information with the largest number of match pairs as the optimal candidate positioning information of the vehicle.
  12. The method according to claim 1, wherein determining, based on the match pairs corresponding to each piece of the candidate positioning information, the optimal candidate positioning information of the vehicle and the match pairs corresponding to the optimal candidate positioning information comprises:
    determining an evaluation value for each piece of the candidate positioning information based on the prior contribution of different candidate map semantic features to vehicle positioning and the match pairs corresponding to each piece of the candidate positioning information;
    selecting the candidate positioning information with the highest evaluation value as the optimal candidate positioning information of the vehicle.
  13. The method according to claim 12, wherein the evaluation value of the candidate positioning information is determined by the following formula:
    score = λ_c × Σc_i + λ_d × Σf(d_i)
    where score is the evaluation value of the candidate positioning information, λ_c and λ_d are prior weights, c_i is the prior contribution to vehicle positioning of the candidate map semantic feature in the i-th match pair, d_i is the semantic-Euclidean distance of the i-th match pair, and f(d_i) is a mapping function inversely related to d_i.
  14. The method according to claim 12, wherein after selecting the candidate positioning information with the highest evaluation value, the method further comprises:
    determining whether the highest evaluation value is less than an evaluation value threshold, and if so, determining that map matching has failed.
  15. A map matching apparatus, comprising:
    an acquisition unit configured to acquire initial positioning information of a vehicle and vehicle sensor data, and to acquire local map information based on the initial positioning information, the local map information comprising multiple map semantic features;
    a first determination unit configured to determine multiple pieces of candidate positioning information based on the initial positioning information, and to determine multiple observed semantic features based on the vehicle sensor data;
    a matching unit configured to, for each piece of the candidate positioning information:
    transform, based on the candidate positioning information, the multiple map semantic features into a coordinate system of the vehicle sensor to obtain multiple candidate map semantic features in that coordinate system;
    match the multiple observed semantic features with the multiple candidate map semantic features to obtain match pairs;
    a second determination unit configured to determine, based on the match pairs corresponding to each piece of the candidate positioning information, optimal candidate positioning information of the vehicle and match pairs corresponding to the optimal candidate positioning information.
  16. An electronic device, comprising a processor and a memory, the processor being configured to execute the steps of the method according to claim 1 by invoking programs or instructions stored in the memory.
  17. A non-transitory computer-readable storage medium storing programs or instructions that cause a computer to execute the steps of the method according to claim 1.
PCT/CN2020/123162 2020-09-27 2020-10-23 A map matching method and apparatus, electronic device and storage medium WO2022062019A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP20954865.0A EP4206610A4 (en) 2020-09-27 2020-10-23 CARD MATCHING METHOD AND APPARATUS, ELECTRONIC DEVICE AND RECORDING MEDIUM
KR1020237009778A KR20230086664A (ko) 2020-09-27 2020-10-23 맵 매칭 방법, 장치, 전자기기 및 저장 매체
JP2023516484A JP2023541167A (ja) 2020-09-27 2020-10-23 マップマッチング方法、装置、電子機器および記憶媒体
US18/028,565 US20230358547A1 (en) 2020-09-27 2020-10-23 Map matching method and apparatus, electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011031457.0A CN112179359B (zh) 2020-09-27 2020-09-27 一种地图匹配方法、装置、电子设备和存储介质
CN202011031457.0 2020-09-27

Publications (1)

Publication Number Publication Date
WO2022062019A1 true WO2022062019A1 (zh) 2022-03-31

Family

ID=73944130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123162 WO2022062019A1 (zh) 2020-09-27 2020-10-23 一种地图匹配方法、装置、电子设备和存储介质

Country Status (6)

Country Link
US (1) US20230358547A1 (zh)
EP (1) EP4206610A4 (zh)
JP (1) JP2023541167A (zh)
KR (1) KR20230086664A (zh)
CN (1) CN112179359B (zh)
WO (1) WO2022062019A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116105603B (zh) * 2023-04-13 2023-09-19 安徽蔚来智驾科技有限公司 用于确定移动物体在场所中的位置的方法及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100324815A1 (en) * 2009-06-18 2010-12-23 Tomoaki Hiruta Position detection apparatus and position detection program
CN107144285A (zh) * 2017-05-08 2017-09-08 深圳地平线机器人科技有限公司 位姿信息确定方法、装置和可移动设备
CN108225346A (zh) * 2016-12-15 2018-06-29 现代自动车株式会社 车辆定位装置和方法
CN111323004A (zh) * 2018-12-16 2020-06-23 北京初速度科技有限公司 初始位置的确定方法及车载终端
CN111323029A (zh) * 2018-12-16 2020-06-23 北京初速度科技有限公司 导航方法及车载终端

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102313547B (zh) * 2011-05-26 2013-02-13 东南大学 基于手绘轮廓语义地图的移动机器人视觉导航方法
CN108732582B (zh) * 2017-04-20 2020-07-10 百度在线网络技术(北京)有限公司 车辆定位方法和装置
CN107990899B (zh) * 2017-11-22 2020-06-30 驭势科技(北京)有限公司 一种基于slam的定位方法和系统
CN109145171B (zh) * 2018-07-23 2020-09-08 广州市城市规划勘测设计研究院 一种多尺度地图数据更新方法
US10983526B2 (en) * 2018-09-17 2021-04-20 Huawei Technologies Co., Ltd. Method and system for generating a semantic point cloud map
CN109733383B (zh) * 2018-12-13 2021-07-20 初速度(苏州)科技有限公司 一种自适应的自动泊车方法及系统
CN110807412B (zh) * 2019-10-30 2022-09-23 驭势科技(北京)有限公司 一种车辆激光定位的方法、车载设备和存储介质
CN110866079B (zh) * 2019-11-11 2023-05-05 桂林理工大学 一种智慧景区实景语义地图的生成与辅助定位方法


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4206610A4 *

Also Published As

Publication number Publication date
CN112179359A (zh) 2021-01-05
CN112179359B (zh) 2022-09-23
JP2023541167A (ja) 2023-09-28
US20230358547A1 (en) 2023-11-09
EP4206610A4 (en) 2024-02-28
EP4206610A1 (en) 2023-07-05
KR20230086664A (ko) 2023-06-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 20954865; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2023516484; Country of ref document: JP; Kind code of ref document: A
ENP Entry into the national phase
    Ref document number: 2020954865; Country of ref document: EP; Effective date: 20230330
NENP Non-entry into the national phase
    Ref country code: DE