CN113375679A - Lane level positioning method, device and system and related equipment - Google Patents

Lane level positioning method, device and system and related equipment


Publication number
CN113375679A
Authority
CN
China
Prior art keywords
lane
probability
sides
vehicle
position point
Prior art date
Legal status
Pending
Application number
CN202010161665.6A
Other languages
Chinese (zh)
Inventor
杨威
毛灵飞
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010161665.6A
Publication of CN113375679A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3446 Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3658 Lane guidance

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a lane-level positioning method, device, system and related equipment. The method comprises the following steps: locating the current position of the vehicle, and randomly generating a predetermined number of position points around the current position; for each position point, determining the probability that the position point may be the true position of the vehicle, where the probability is determined in multiple ways, such as from the error of the electronic map and from the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines as perceived by at least two perception technologies; fusing the probabilities obtained in these ways into a final probability; and positioning the vehicle based on the final probability. The method and the device solve the problem of low accuracy when positioning with a single mode, improve the positioning accuracy of vehicle positioning and the robustness in different scenes, achieve accurate lane-level positioning, can adapt to positioning requirements of different levels in different scenes, and reduce the influence of external factors on positioning accuracy.

Description

Lane level positioning method, device and system and related equipment
Technical Field
The invention relates to the technical field of intelligent transportation, and in particular to a lane-level positioning method, device, system and related equipment.
Background
In intelligent transportation technology, the position of a vehicle needs to be positioned accurately in order to realize automatic driving or intelligent assisted driving. At present, most vehicle positioning methods rely on a global navigation satellite system and an inertial navigation system; on the one hand, the positioning accuracy of these systems cannot reach the lane-level accuracy required by intelligent transportation, and on the other hand, these systems cannot continue to position the vehicle after losing their signals in certain environments.
Existing lane-level positioning methods for vehicles include: positioning based on the carrier-phase differential technique, also called Real-Time Kinematic (RTK) technology; positioning based on visual perception technology such as Mobileye vision hardware; positioning based on Light Detection and Ranging (LIDAR) technology; and the like. However, these positioning modes locate the lane in which the vehicle is located based on a single mode and cannot achieve the positioning accuracy required by automatic driving and intelligent assisted driving. Moreover, because a single positioning mode is adopted, they cannot meet positioning requirements of different levels, cannot adapt robustly to different scenes, and their accuracy and recall rate for main/auxiliary road recognition are also low. Robustness here refers to stability under different situations, such as different external conditions and weather conditions, and different scenes such as tunnels, viaducts, canyons, urban areas and the like.
Therefore, the positioning accuracy and adaptability of existing vehicle positioning technology cannot meet the positioning requirements of automatic driving and intelligent assisted driving, and accurately positioning the vehicle to achieve lane-level positioning has become an urgent technical problem to be solved.
Disclosure of Invention
In view of the technical drawbacks and disadvantages of the prior art, embodiments of the present invention provide a lane-level positioning method, apparatus, system and related device that overcome or at least partially solve the above problems.
As an aspect of an embodiment of the present invention, a lane-level positioning method may include:
locating a current position of a vehicle, randomly generating a predetermined number of location points around the current position;
determining a first probability of each position point being located in each lane according to the position of the position point, the error of the electronic map and the position range of each lane obtained from map data;
acquiring the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines, as perceived by at least two perception technologies, matching them against the line types of the lane lines on both sides of the lane in which each position point is located and the distances from that position point to those lane lines, as acquired from map data, and determining a second probability that each position point is located in each lane;
and determining the final probability of each position point in each lane according to the first probability and the second probability, and determining the lane with the maximum final probability as the lane in which the vehicle is located.
As another aspect of the embodiments of the present invention, a lane-level positioning apparatus may include:
the current position acquisition module is used for positioning the current position of the vehicle;
a location point generating module for randomly generating a predetermined number of location points around the current location;
the first probability determination module is used for determining a first probability that each position point is located in each lane according to the position of the position point, the error of the electronic map and the position range of each lane obtained from map data;
the second probability determination module is used for acquiring the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines, as perceived by at least two perception technologies, matching them against the line types of the lane lines on both sides of the lane in which each position point is located and the distances from that position point to those lane lines, as acquired from the map data, and determining the second probability that each position point is located in each lane;
and the lane determining module is used for determining the final probability of each position point in each lane according to the first probability and the second probability, and determining the lane with the maximum final probability as the lane in which the vehicle is located.
As a third aspect of the embodiments of the present invention, a vehicle positioning system may include: the system comprises a server, a vehicle-mounted positioning terminal and a sensing terminal;
the perception terminal is used for perceiving the line type of the lane lines on the two sides of the lane where the vehicle is located and the distance from the lane lines on the two sides;
the vehicle-mounted positioning terminal is used for positioning the current position of the vehicle;
the server comprises the lane-level positioning device of the second aspect.
As a fourth aspect of the embodiments of the present invention, it relates to a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the lane-level localization method described above.
As a fifth aspect of the embodiments of the present invention, it relates to a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the lane-level localization method as described above.
The embodiment of the invention at least realizes the following technical effects:
the method comprises the steps of randomly generating a preset number of position points around the current position of a positioned vehicle, respectively determining the probability that each position point is possibly the real vehicle position according to each position point, respectively determining the probability that each position point is possibly the real vehicle position in each mode based on electronic map errors, at least two sensing technologies, such as lane line types at two sides of a lane where the vehicle is located, distances from lane lines at two sides and the like, and fusing the probabilities obtained in the multiple modes to obtain a final probability, and positioning the position of the vehicle based on the final probability. The problem of low accuracy when adopting single mode location is overcome, has improved the positioning accuracy of vehicle location, has realized the accurate location of lane level, and can adapt to the different rank location demands of different scenes, has reduced the influence of external factor to the location accuracy, has improved the robustness of vehicle location under the different scenes, has also improved the rate of accuracy and the recall rate of major-minor way discernment simultaneously.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of a lane-level positioning method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating the principle of the DR technology;
FIG. 3 is a detailed flowchart of the implementation of step S11 in FIG. 1;
FIG. 4 is a diagram illustrating an example of randomly generated location points in an embodiment of the present invention;
FIG. 5 is a detailed flowchart of the implementation of step S13;
FIG. 6 is an exemplary illustration of determining a first probability for a lateral position range for each location point;
FIG. 7 is a flowchart illustrating the first implementation manner of step S14;
fig. 8 is a schematic diagram of the determination of the line type probability and the distance probability in step S14;
FIG. 9 is a flowchart illustrating the second implementation manner of step S14;
fig. 10 is a schematic structural diagram of a lane-level positioning device according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a lane-level positioning system according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The present embodiment provides a lane-level positioning method, which may include the following steps, as shown in fig. 1:
and step S11, positioning the current position of the vehicle.
Preliminary positioning is performed on the vehicle to obtain the current position of the vehicle. Two or more positioning technologies can be selected to preliminarily position the vehicle, and the preliminary positioning positions are fused to obtain the current position of the vehicle.
Step S12, a predetermined number of location points are randomly generated around the current location.
A predetermined number of position points are generated around the current position based on a preset estimation algorithm; the position points may be set in a Gaussian distribution manner, and each position point may be the position of the vehicle.
Step S13, determining a first probability that each position point is located in each lane according to the position of the position point, the error of the electronic map, and the position range of each lane acquired from the map data.
In this step, because spatial coordinates are projected onto the electronic map with a certain error, each position point generated from the current position of the vehicle has an error offset with respect to each lane; this step calculates the probability that each position point is located in each lane.
And step S14, acquiring the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines, as perceived by at least two perception technologies, matching them against the line types of the lane lines on both sides of the lane in which each position point is located and the distances from that position point to those lane lines, as acquired from the map data, and determining a second probability that each position point is located in each lane.
Because existing perception technologies have certain errors, the recognized line types of the lane lines on both sides of the lane in which the vehicle is located have certain errors, and the recognized distances to the lane lines on both sides are not equal to the distances from each position point to the lane lines on both sides. A matching probability therefore exists during matching, and this step calculates the probability that the vehicle matches each position point when that position point is located in each lane.
Steps S13 and S14 have no fixed execution order: step S13 may be executed first and then step S14, or step S14 may be executed first and then step S13, or both may be executed simultaneously; this is not limited in the embodiment of the present invention.
And step S15, determining the final probability that each position point is positioned in each lane according to the first probability and the second probability.
And step S16, determining the lane with the maximum final probability as the lane where the vehicle is located.
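As an illustration of steps S15 and S16, the following is a minimal sketch in Python. It assumes that the final probability of a position point for a lane is the product of its first and second probabilities, and that the per-lane values are summed over all position points before taking the maximum; the embodiment does not fix these two choices, so they are assumptions for illustration only.

```python
def select_lane(first_prob, second_prob, num_lanes):
    """first_prob, second_prob: dicts mapping (point_id, lane_id) to a probability.
    Returns the index of the lane with the highest aggregated final probability."""
    lane_scores = [0.0] * num_lanes
    for (point_id, lane_id), p1 in first_prob.items():
        p2 = second_prob.get((point_id, lane_id), 0.0)
        lane_scores[lane_id] += p1 * p2          # step S15 (assumed fusion: product)
    return max(range(num_lanes), key=lambda lane: lane_scores[lane])  # step S16
```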
The method and the device randomly generate a predetermined number of position points around the current position of the located vehicle and determine, for each position point, the probability that the position point may be the true position of the vehicle. When determining this probability, the probability that each position point may be the true vehicle position is determined in multiple ways, based on the error of the electronic map and on the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines as perceived by at least two perception technologies; the probabilities obtained in these ways are then fused into a final probability, and the vehicle is positioned based on the final probability. This overcomes the problem of low accuracy when positioning with a single mode, improves the positioning accuracy of vehicle positioning, achieves accurate lane-level positioning, can adapt to positioning requirements of different levels in different scenes, reduces the influence of external factors on positioning accuracy, improves the robustness of vehicle positioning in different scenes, and at the same time improves the accuracy and recall rate of main/auxiliary road recognition.
The above steps of the embodiments of the present invention are explained in detail as follows:
In the above step S11, the current position of the vehicle may be located in various manners, for example by GPS navigation or Beidou navigation, or by Dead Reckoning (DR positioning for short) based on the principle of inertial motion measurement. To avoid the situation in which these positioning methods cannot normally receive signals under certain specific circumstances, the embodiment of the present invention determines the current position of the vehicle by at least two positioning technologies.
The specific implementation of this step is as follows: determining at least two preliminary positioning positions of the vehicle based on the selected at least two positioning technologies, and fusing the at least two preliminary positioning positions in a preselected position fusion manner to obtain the current position of the vehicle.
In the embodiment of the present invention, one preliminary positioning position of the vehicle is determined by the carrier-phase differential technique (also called Real-Time Kinematic, RTK for short), a new and now common GPS measurement method. Whereas centimeter-level precision could previously be obtained only by post-processing, RTK is a measurement method that can obtain centimeter-level positioning precision in the field in real time: it can provide the three-dimensional coordinates of an observation point in real time with centimeter-level accuracy. The principle is the same as that of pseudo-range differencing: a reference station transmits its carrier observations and station coordinate information to the user station in real time through a data link; the user station receives the carrier phase of the GPS satellites together with the carrier phase from the reference station, forms phase-difference observations and processes them in real time, so that a centimeter-level positioning result can be given in real time.
The embodiment of the present invention determines the other preliminary positioning position of the vehicle by the DR positioning technology. Unlike traditional positioning technology, which uses a GPS module to receive navigation satellite data, DR positioning calculates the position and track of the target by measuring its actual motion, so it is little affected by the environment and can compensate well for positioning when GPS signal reception is poor or lost. However, as with most such algorithms, because the positioning sensor itself has errors, the positioning results gradually diverge as time progresses and the final positioning becomes inaccurate.
Specifically, the position of the vehicle at the previous moment may be determined by the RTK technique or a GPS system and used as the initial position of the vehicle. For example, when a user is driving and the satellite signal becomes unstable, the satellite positioning result measured at the moment immediately before the signal became unstable is taken as the initial position of the vehicle and recorded as (x0, y0).
Then, the driving direction and the driving distance of the vehicle from the initial position are determined from the sensing data of the inertial sensor on the vehicle. An inertial sensor is a sensor mainly used for detecting and measuring acceleration, inclination, impact, vibration, rotation and multi-degree-of-freedom (DoF) motion, and is an important component for navigation, orientation and motion carrier control. For example, common inertial sensors such as gyroscopes and accelerometers provide data such as the angular velocity and linear velocity of the vehicle, from which the position information of the vehicle is obtained by integration.
And finally, the first current position is determined according to the initial position, the driving direction and the driving distance. As shown in fig. 2, the travel distances S0, S1, S2, S3, … of the vehicle during travel, the heading angle θ of the driving direction and the heading change values Δθ0, Δθ1, Δθ2, Δθ3, … can be obtained by the above-described inertial sensor. Because the initial position (x0, y0) of the vehicle is known, the coordinates (x1, y1), (x2, y2), … of the vehicle at each point can be calculated by the following formulas, and the coordinates of the first current position are obtained as (xD, yD):
x1 = x0 + S0*sinθ0;  x2 = x1 + S1*sinθ1;  …
y1 = y0 + S0*cosθ0;  y2 = y1 + S1*cosθ1;  …
θ1 = θ0 + Δθ0;  θ2 = θ1 + Δθ1;  …
and in general:
xk = xk-1 + Sk-1*sinθk-1;  yk = yk-1 + Sk-1*cosθk-1;  θk = θk-1 + Δθk-1,  k = 1, 2, 3, …
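The dead-reckoning recursion above translates directly into code. The following is a minimal sketch in Python, assuming the heading angle θ is measured from the y axis so that x uses the sine and y uses the cosine, as in the formulas above; the function and parameter names are illustrative only.

```python
import math

def dead_reckon(x0, y0, theta0, distances, delta_thetas):
    """Propagate the vehicle position from the initial position (x0, y0) and initial
    heading theta0, given per-step travel distances S_k and heading changes delta_theta_k."""
    x, y, theta = x0, y0, theta0
    track = [(x, y)]
    for s, d_theta in zip(distances, delta_thetas):
        x += s * math.sin(theta)   # x_k = x_{k-1} + S_{k-1} * sin(theta_{k-1})
        y += s * math.cos(theta)   # y_k = y_{k-1} + S_{k-1} * cos(theta_{k-1})
        theta += d_theta           # theta_k = theta_{k-1} + delta_theta_{k-1}
        track.append((x, y))
    return track                   # the last element approximates (xD, yD)
```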
In an embodiment of the present invention, the preliminary positioning positions obtained by the RTK technique and the DR technique are fused by a preselected position fusion method, for example by the Kalman filtering technique. As shown in fig. 3, this specifically includes the following steps:
and step S111, determining a Kalman gain coefficient according to the variance of each primary positioning position.
Kalman filtering is an algorithm that uses a linear system state equation to perform optimal estimation of the system state from the system's input and output observation data. Since the observation data include the effects of noise and interference in the system, the optimal estimation can also be seen as a filtering process. The fusion here combines two different data by means of the Kalman filtering technique to obtain the most suitable quantitative result: Kalman filtering tries to remove the influence of noise by using the dynamic information of the target to obtain an optimal estimate of the target position, where the estimate can be an estimate of the current position (filtering), of a future position (prediction), or of a past position (interpolation or smoothing).
For example, suppose the coordinates of the preliminary positioning position of the vehicle determined by the above RTK positioning technique are (xR, yR), and the coordinates of the other preliminary positioning position determined by the DR positioning technique are (xD, yD); (xD, yD) and (xR, yR) are fused by Kalman filtering, and the fused position coordinates are (xK, yK). Specifically, the Kalman gain coefficient may be determined according to the variance of the first preliminary positioning position and the variance of the second preliminary positioning position. The Kalman gain coefficient (in so-called "empirical Kalman filtering") is adjusted based on knowing empirically under which conditions the variance of the sensor data is large and under which conditions it is small.
For example, the Kalman gain coefficient is denoted Ki and may be obtained by the following equation: Ki = (Pi-1 + Q) / (Pi-1 + Q + R), where Pi-1 is the variance of the previous estimate, Q is the variance of the Gaussian noise, R is the variance of the measurement, and Q and R are both constants. As can be seen from this formula, K is determined by the ratio of the variances; since Q and R are constants, the variance of the previous estimate plays the decisive role. If that variance turns out to be large after the previous estimation, meaning the previous estimate is not very reliable, it is easy to see from this formula that K becomes larger and closer to 1; combined with the update formula in the next step, the measured value then receives a larger weight, that is, the current estimate trusts the measurement more.
And step S112, performing weight distribution on each preliminary positioning position by using a Kalman gain coefficient.
Weight distribution is performed on the first preliminary positioning position and the second preliminary positioning position using the Kalman gain coefficient, and the fused result is obtained as: Xi = (1 - Ki)*Xi-1 + Ki*Zi, with the variance of the fused position being Pi = (1 - Ki)*Pi-1, where the first preliminary positioning position, denoted Xi-1, is taken as the estimate at the previous moment, and the second preliminary positioning position, denoted Zi, is taken as the measurement. The formula shows that the current estimate is determined by the previous estimate and the current measurement, and the weights of the two are determined by Ki, the Kalman gain coefficient.
And S113, performing weighted calculation on the preliminary positioning positions based on the distributed weights to obtain the current position of the vehicle. In this step, the position coordinates of the vehicle finally obtained through the fusion are (xK, yK).
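As a concrete illustration of steps S111 to S113, the following is a minimal one-dimensional sketch in Python of the gain computation and weighted fusion described above; applying it separately to the x and y coordinates, and the placeholder constants Q and R, are assumptions for illustration.

```python
def kalman_fuse(x_prev, p_prev, z_meas, q_noise, r_meas):
    """Fuse the previous estimate (one preliminary position coordinate) with a measurement
    (the other preliminary position coordinate) using the scalar Kalman gain
    K_i = (P_{i-1} + Q) / (P_{i-1} + Q + R)."""
    k = (p_prev + q_noise) / (p_prev + q_noise + r_meas)  # step S111: Kalman gain
    x_fused = (1 - k) * x_prev + k * z_meas               # steps S112-S113: X_i = (1-K_i)*X_{i-1} + K_i*Z_i
    p_fused = (1 - k) * p_prev                            # P_i = (1-K_i)*P_{i-1}, as in the text
    return x_fused, p_fused

# For example, the x coordinates of the two preliminary positions could be fused as
# x_K, p_K = kalman_fuse(x_R, p_R, x_D, Q, R), and the y coordinates likewise.
```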
In the embodiment of the present invention, in the above step S11, preliminary positioning is performed with at least two technologies, and the at least two preliminary positioning positions are then fused by the Kalman filtering technique to obtain the absolute position of the vehicle. This avoids the situation in which positioning fails because a weak signal prevents the positioning signal from being received; furthermore, fusing the results of multiple positioning methods makes the positioning result better match the actual situation, so the positioning is more accurate.
In the above step S12, a predetermined number of position points are randomly generated around the current position.
The position points are a plurality of noise-bearing points set in the target area in a Gaussian-distributed manner when the position is estimated through Kalman filtering based on the Bayesian estimation principle, and these points are matched according to the target characteristics. The predetermined number can be set according to actual needs; in general, setting 50-100 position points is sufficient to meet the requirement of high-precision positioning.
Specifically, the step may include: determining the variance of the current position according to the variance of the primary positioning position and the Kalman gain coefficient; a predetermined number of location points are randomly generated around the current location based on the variance of the current location.
The variance Pi of the fused position is calculated in the above step; then, a predetermined number of position points are randomly generated around the fused position according to this variance, so that the position points are randomly generated in a manner conforming to a Gaussian distribution. For example, referring to fig. 4, a predetermined number of position points are randomly generated around the fused position.
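The generation of Gaussian-distributed position points around the fused position could look like the minimal sketch below; treating the fused variance Pi as an isotropic variance shared by both coordinates, and the default of 100 points, are assumptions for illustration.

```python
import numpy as np

def generate_position_points(x_fused, y_fused, variance, num_points=100):
    """Randomly draw position points around the fused position (x_fused, y_fused),
    with spread given by the variance of the fused position."""
    std = float(np.sqrt(variance))
    xs = np.random.normal(loc=x_fused, scale=std, size=num_points)
    ys = np.random.normal(loc=y_fused, scale=std, size=num_points)
    return list(zip(xs.tolist(), ys.tolist()))
```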
In the above step S13, a first probability that each of the position points is located in each lane is determined based on the position of the position point, the error of the electronic map, and the position range of each lane obtained from the map data.
By randomly generating a predetermined number of position points, the coordinates of each position point are obtained, and the width, position and other attributes of each lane can be obtained from the map data. Because the electronic map has a certain error each time the vehicle is positioned, the probability that each position point is located in each lane, namely the first probability, can be determined from these data.
Specifically, as shown in fig. 5, the method may include the following steps:
and S131, determining the transverse position range of the position point on the electronic map according to the error of the electronic map and the position of the position point.
Here, the lateral position range refers to a position range in the direction perpendicular to the driving direction of the lane. The error of the electronic map is the error produced when each position point is projected from its actual spatial position onto the electronic map; it may be determined according to the actual situation and historical empirical values. For example, on a common highway the lane width is 3.5 m, and the error of the electronic map may be set to 1 m on each of the left and right sides.
And S132, determining a first probability that each position point is located in each lane in the electronic map according to the transverse position range and the position range of each lane.
For example, referring to fig. 6, the coordinate value of a certain position point is obtained through the Gaussian distribution. To calculate the probability of which lane the point is located in, only the lateral projection coordinate of the position point needs to be considered, not the coordinates in other directions. Suppose the distance between the position point and the right edge of the second lane is 0.8 m, the lane width is 3.5 m, and the preset error value is 1 m on each of the left and right sides. By calculation, the coverage of the second lane is 1.8 m, the coverage of the third lane is 0.2 m, and the coverage of the first, fourth and fifth lanes is 0 m, so the probability that the position point is located in the first, fourth or fifth lane is 0%; the probability of being located in the second lane is 1.8/(1.8+0.2), i.e. 90%; and the probability of being located in the third lane is 0.2/(1.8+0.2), i.e. 10%. Based on this probability calculation principle, the probability that every position point is located in each lane can be determined.
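The coverage-based calculation of the example can be sketched as follows; representing each lane by its lateral boundary interval and treating the 1 m map error as a parameter are assumptions for illustration.

```python
def first_probability(point_lateral, lane_bounds, map_error=1.0):
    """point_lateral: lateral coordinate of a position point.
    lane_bounds: list of (left, right) lateral boundaries of each lane.
    Returns, for each lane, the probability that the point lies in that lane."""
    lo, hi = point_lateral - map_error, point_lateral + map_error
    coverage = [max(0.0, min(hi, right) - max(lo, left)) for left, right in lane_bounds]
    total = sum(coverage)
    return [c / total if total > 0 else 0.0 for c in coverage]

# Example from the text: 3.5 m lanes, point 0.8 m from the right edge of the second lane,
# map error 1 m on each side:
# first_probability(2.7, [(-3.5, 0.0), (0.0, 3.5), (3.5, 7.0), (7.0, 10.5), (10.5, 14.0)])
# -> [0.0, 0.9, 0.1, 0.0, 0.0]
```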
In step S14, the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where the vehicle is located, which are sensed by at least two sensing technologies, are obtained, and the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where each position point is located, which are obtained from the map data, are matched to determine the second probability that each position point is located in each lane.
In this step, the second probability that the vehicle, matched against each position point, is located in each lane can be obtained in two ways. The line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines can be obtained by different perception technologies. The line types and distances perceived by the different technologies can first be matched separately against the line types and distances corresponding to each position point, and the matching results then fused; alternatively, the line types and distances perceived by the different technologies can first be fused and then matched against the line types and distances corresponding to each position point.
<First mode> In this mode, matching is performed first and fusion afterwards. As shown in fig. 7, it may include the following steps:
step S1411, respectively determining lane line types and distances to lane lines on two sides of a lane where the vehicle is located according to the shot road image and the collected point cloud data.
The lane line type refers to the type of the lines that divide the lanes on a road. For example, the lane line type may include: single solid line, single dashed line, double solid line, double dashed line, and the like. The distance from the vehicle to the lane lines on both sides refers to the lengths, along the lateral projection, from the vehicle to the lane lines on the left and right sides of the lane.
The embodiment of the present invention may capture the road image through a visual perception technology such as Mobileye vision hardware, and may also capture a point cloud image of the road based on the Lidar laser perception technology. From the obtained images, the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines can be identified. In the embodiment of the present invention, when the line type is identified with the Mobileye technology, the color of the lane line can also be identified and used as a matching parameter for registration to obtain the probability that the vehicle matches a position point; for example, existing lane line colors include white, yellow and the like.
Step S1412, matching the lane line types on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are determined according to the road image, with the lane line types on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are obtained from the map data, to obtain a third probability that each position point is located in each lane.
Because the lane line type identified by the Mobileye vision technique does not completely match the lane line on the actual high-precision map, the identified lane line type is not necessarily accurate; for example, in the embodiment of the present invention, the accuracy of the Mobileye vision output is 95%. The distance to a lane line identified by the Mobileye vision technique is a single value, but the distances from different position points to the lane lines on both sides are not the same. Therefore, the third probability is a fusion of two probabilities, namely the line type probability and the distance probability. Suppose, for example, that the Mobileye vision technique identifies the line type of the lane in which the vehicle is located as a solid line on the left and a dashed line on the right, and the distance μ between the vehicle and the left lane line as 1 m.
The line type probability may be calculated based on the line types of the lane lines on both sides of the lane in which each position point is located and the distances to the two lane lines, as acquired from the map data. The line types of the lane lines on both sides of the lane in which a position point is located are matched against the line types identified for the vehicle's lane by the Mobileye vision technique, giving the probability that the line types on both sides of the point's lane are the identified line types, and finally the line type probability that the position point is located in each lane. For example, without considering single versus double solid lines and referring to fig. 8, the line type matching probability for the lane in which position point A is located is 95% × 95%, and the line type matching probability for the lane in which position point B is located is 95% × 5%.
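The line type matching of the example can be sketched as below, assuming a single per-sensor accuracy value and that the match probabilities of the left and right lane lines are multiplied, as in the 95% × 95% and 95% × 5% figures above.

```python
def line_type_probability(perceived_types, map_types, accuracy=0.95):
    """perceived_types / map_types: (left_type, right_type) tuples, e.g. ('solid', 'dashed').
    Returns the probability that the line types of the lane containing a position point
    (from the map) match the line types perceived for the vehicle's lane."""
    prob = 1.0
    for seen, on_map in zip(perceived_types, map_types):
        prob *= accuracy if seen == on_map else (1.0 - accuracy)
    return prob

# Point A's lane matches on both sides: 0.95 * 0.95; point B's lane differs on one side: 0.95 * 0.05.
```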
The distance probability can be calculated with a selected probability estimation method from the distances between a position point and the lane lines on both sides, as acquired from the map data, and their difference from the distance between the vehicle and the corresponding lane line identified by the Mobileye vision technique, giving the distance probability of each position point for each lane. Referring again to fig. 8, from the coordinates of the lane in which each position point is located it can be obtained, for example, that the distance between position point A and the left lane line is 2.8 m, and the distance between position point B and the left lane line is 1.7 m. The probability that the vehicle coincides with position point A or position point B can then be calculated according to formula (1).
P(x) = (1 / (σ·√(2π))) · exp(-(x - μ)² / (2σ²))        (1)
where μ is the perceived distance of the vehicle from the left (or right) lane line; x is the distance from the position point to the left (or right) lane line; and σ is a constant related to the height and angle of the camera on the vehicle, generally an empirical or estimated value.
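Formula (1) is a one-dimensional Gaussian density. A minimal Python sketch of the distance probability for a position point might look like this, with σ as an assumed camera-dependent constant:

```python
import math

def distance_probability(point_dist, perceived_dist, sigma=0.5):
    """Evaluate formula (1): how well the distance from a position point to a lane line
    (point_dist, from the map) agrees with the perceived distance mu from the vehicle
    to the same lane line (perceived_dist). sigma is an assumed constant."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((point_dist - perceived_dist) ** 2) / (2.0 * sigma ** 2))

# e.g. for position point B at 1.7 m from the left lane line and a perceived distance of 1 m:
# distance_probability(1.7, 1.0)
```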
And step S1413, matching the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines, as determined from the point cloud data, against the line types of the lane lines on both sides of the lane in which each position point is located and the distances to those lane lines, as acquired from the map data, to obtain a fourth probability that each position point is located in each lane.
The fourth probability that each position point is located in each lane is calculated in this step in the same way as in step S1412; the difference from step S1412 is that the accuracy of recognizing the lane line type and the recognized distance from the vehicle to the lane lines differ from those recognized from the road image. Therefore, the probability that each position point is located in each lane obtained from the collected point cloud data is not necessarily the same as that obtained from the captured road image.
And S1414, performing fusion processing on the third probability and the fourth probability of each position point to obtain a second probability that each position point is located in each lane.
The third probability that the vehicle matches each position point, obtained in step S1412, and the fourth probability obtained in step S1413 are fused, for example by multiplying the two probability values, to obtain the second probability that the vehicle, matched to each position point, is located in each lane.
<Second mode> In this mode, fusion is performed first and matching afterwards. As shown in fig. 9, it may include the following steps:
step S1421, respectively determining the line type of the lane lines on the two sides of the lane where the vehicle is located and the distance to the lane lines on the two sides according to the shot road image and the collected point cloud data, and performing fusion processing on the determined line type and distance.
In this step, the line types of the lane lines on both sides of the road where the vehicle is located and the distances to those lane lines, as identified by the different technologies, are fused to determine the fused line types and distances. For example, if the probability that the line type of the lane in which the vehicle is located is a solid line on the left and a dashed line on the right is 95% according to the road image captured by the Mobileye vision technology, and 90% according to the point cloud data collected by the Lidar technology, the determined probabilities may be fused; multiplying them, for example, gives a probability of 85.5% that the line type of the lane in which the vehicle is located is a solid line on the left and a dashed line on the right. Similarly, the distances from the vehicle to the lane lines on both sides determined by the two technologies can be fused into a single distance value by averaging or in another preset manner.
In this step, fusing the lane line types and the distances to the lane lines on both sides identified by different technologies avoids errors and oversights caused by a single algorithm, further improving the accuracy of the positioning recognition.
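A minimal sketch of this fuse-then-match step, assuming the per-sensor line type probabilities are multiplied and the two distances are simply averaged, as in the example above:

```python
def fuse_perception(camera_line_prob, lidar_line_prob, camera_dist, lidar_dist):
    """Fuse the line type confidences and the distances to a lane line reported by
    the camera (e.g. Mobileye) and by the Lidar before matching against position points."""
    fused_line_prob = camera_line_prob * lidar_line_prob  # e.g. 0.95 * 0.90 = 0.855
    fused_dist = 0.5 * (camera_dist + lidar_dist)         # simple averaging (one possible preset manner)
    return fused_line_prob, fused_dist
```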
Step S1422, matching the lane line types on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides with the lane line types on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are acquired from the map data, to obtain a second probability that each position point is located on each lane.
Specifically, this step can be shown with reference to fig. 9, and the implementation steps are as follows:
step S14221, obtaining the line type probability that the position point is located in each lane according to the line type of the lane line on both sides of the lane where the position point is located, which is obtained from the map data, and the probability that the lane line on both sides of the lane where the vehicle is located after the fusion processing is the line type.
Step S14222, obtaining the distance probability of the position point in each lane by using a selected probability estimation algorithm according to the difference between the distance from the position point to the lane lines on both sides and the distance from the position point to the lane line on the corresponding side after fusion processing, which are obtained from the map data.
Step S14223, respectively fusing the line type probability and the distance probability of each position point to obtain a second probability that each position point is located in each lane.
In this step, the manner of matching the fused line types of the lane lines of the vehicle's lane and the fused distances to the lane lines on both sides against the line types and distances corresponding to each position point may refer to the detailed description of step S1412, and is not repeated here. It should be noted that steps S14221 and S14222 have no fixed execution order: step S14221 may be executed first, or step S14222 may be executed first, or both may be executed simultaneously; this is not specifically limited in the embodiment of the present invention.
In the embodiment of the present invention, the positioning obtained by the dead reckoning technique and the positioning obtained by the carrier-phase differential technique are first fused based on Kalman filtering to obtain the current position of the vehicle; a predetermined number of position points are then randomly generated from the current position of the vehicle, and for each position point the probability that it may be the true position of the vehicle is determined. When determining this probability, the probability that each position point may be the true vehicle position is determined in multiple ways, based on the error of the electronic map and on the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines as perceived by at least two perception technologies; the probabilities obtained in these ways are fused into a final probability, and the position of the vehicle is determined based on the final probability. This not only overcomes the problem of low accuracy when positioning with a single mode, improves the positioning accuracy of vehicle positioning, achieves accurate lane-level positioning, adapts to positioning requirements of different levels in different scenes, reduces the influence of external factors on positioning accuracy, and remains robust in different scenes, but also improves the accuracy and recall rate of main/auxiliary road recognition.
Based on the same inventive concept, an embodiment of the present invention provides a lane-level positioning apparatus, which, as shown in fig. 10, may include: the current position acquisition module 11, the position point generation module 12, the first probability determination module 13, the second probability determination module 14 and the lane determination module 15, which work on the following principles:
the current position acquisition module 11 locates the current position of the vehicle. Specifically, the current position obtaining module 11 determines at least two preliminary positioning positions of the vehicle based on the selected at least two positioning technologies, and fuses the at least two preliminary positioning positions in a preselected position fusion manner to obtain the current position of the vehicle. More specifically, the current position obtaining module 11 uses a preselected position fusion mode to fuse the at least two preliminary positioning positions to obtain the current position of the vehicle, including: the current position obtaining module 11 determines a kalman gain coefficient according to the variance of each of the preliminary positioning positions; the current position obtaining module 11 uses the kalman gain coefficient to perform weight distribution on each preliminary positioning position, and the current position obtaining module 11 performs weighted calculation on the preliminary positioning position based on the distributed weight to obtain the current position of the vehicle.
The location point generation module 12 randomly generates a predetermined number of location points around the current location. Specifically, the location point generating module 12 determines the variance of the current location according to the variance of the preliminary location and the kalman gain coefficient; the location point generating module 12 randomly generates a predetermined number of location points around the current location according to the variance of the current location.
The first probability determination module 13 determines a first probability that each of the position points is located in each lane according to the position of the position point, the error of the electronic map, and the position range of each lane acquired from the map data. Specifically, the first probability determination module 13 determines a lateral position range of the position point on the electronic map according to the error of the electronic map and the position of the position point; the first probability determination module 13 determines a first probability that each of the location points is located in a respective lane in the electronic map according to the lateral location range and the location range of the respective lane.
The second probability determination module 14 obtains the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are sensed by at least two sensing technologies, matches the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are obtained from the map data, and determines the second probability that each position point is located in each lane.
Specifically, the second probability determination module 14 determines lane line types and distances to lane lines on two sides of a lane where a vehicle is located according to the captured road image and the collected point cloud data; the second probability determination module 14 matches the lane line types on the two sides of the lane where the vehicle is located and the distances to the lane lines on the two sides, which are determined according to the road image, with the lane line types on the two sides of the lane where each position point is located and the distances to the lane lines on the two sides, which are obtained from the map data, so as to obtain a third probability that each position point is located in each lane; the second probability determination module 14 matches the lane line types on the two sides of the lane where the vehicle is located and the distances to the lane lines on the two sides, which are determined according to the point cloud data, with the lane line types on the two sides of the lane where each position point is located and the distances to the lane lines on the two sides, which are obtained from the map data, so as to obtain a fourth probability that each position point is located in each lane; the second probability determining module 14 performs fusion processing on the third probability and the fourth probability of each position point to obtain a second probability that each position point is located in each lane.
Or, specifically, the second probability determining module 14 determines line types of lane lines on both sides of a lane where the vehicle is located and distances to the lane lines on both sides respectively according to the captured road image and the collected point cloud data, and performs fusion processing on the determined line types and distances; the second probability determination module 14 matches the lane line types on the two sides of the lane where the vehicle is located and the distances to the lane lines on the two sides with the lane line types on the two sides of the lane where each position point is located and the distances to the lane lines on the two sides, which are acquired from the map data, to obtain a second probability that each position point is located on each lane.
More specifically, the second probability determining module 14 obtains the line type probability that the position point is located in each lane according to the line type of the lane line on both sides of the lane where the position point is located, which is obtained from the map data, and the probability that the lane line on both sides of the lane where the vehicle is located after the fusion processing is the line type; the second probability determination module 14 obtains the distance probability of the position point in each lane by adopting a selected probability estimation algorithm according to the difference between the distance from the position point to the lane lines on both sides and the distance from the position point to the lane line on the corresponding side after fusion processing, which are obtained from the map data; the second probability determining module 14 respectively fuses the line type probability and the distance probability of each position point to obtain a second probability that each position point is located in each lane.
The lane determining module 15 determines a final probability that each of the position points is located in each lane according to the first probability and the second probability, and determines the lane with the maximum final probability as the lane where the vehicle is located.
For the technical effects and the related examples of the device of this embodiment, reference may be made to the related contents in the above method, and further description is omitted here.
Based on the same inventive concept, an embodiment of the present invention provides a lane-level positioning system, which, referring to fig. 11, may include: the system comprises a server 1, a vehicle-mounted positioning terminal 2 and a sensing terminal 3;
the sensing terminal 3 is used for sensing the line types of the lane lines on both sides of the lane in which the vehicle is located and the distances to those lane lines, and may include a camera device 31 and a point cloud acquisition device 32;
the vehicle-mounted positioning terminal 2 is used for positioning the current position of the vehicle;
the server 1 comprises a lane-level positioning device as described above.
For the technical effects and the related examples of the system of this embodiment, reference may be made to the related contents of the method embodiments described above, and details are not described herein again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (12)

1. A lane-level positioning method, comprising:
locating a current position of a vehicle, and randomly generating a predetermined number of position points around the current position;
determining a first probability of each position point being located in each lane according to the position of the position point, the error of the electronic map and the position range of each lane obtained from map data;
acquiring the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are sensed by at least two sensing technologies, matching them with the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are acquired from map data, and determining a second probability that each position point is located in each lane;
and determining the final probability of each position point in each lane according to the first probability and the second probability, and determining the lane with the maximum final probability as the lane in which the vehicle is located.
2. The method of claim 1, the locating a current position of a vehicle, comprising:
and determining at least two preliminary positioning positions of the vehicle based on at least two selected positioning technologies, and fusing the at least two preliminary positioning positions in a preselected position fusion mode to obtain the current position of the vehicle.
3. The method of claim 2, the fusing the at least two preliminary positioning positions in a preselected position fusion mode to obtain the current position of the vehicle, comprising:
determining a Kalman gain coefficient according to the variance of each preliminary positioning position;
and performing weight distribution on each preliminary positioning position by using the Kalman gain coefficient, and performing weighted calculation on the preliminary positioning positions based on the distributed weights to obtain the current position of the vehicle.
4. The method of claim 3, the randomly generating a predetermined number of position points around the current position, comprising:
determining the variance of the current position according to the variance of the preliminary positioning position and the Kalman gain coefficient;
and randomly generating a predetermined number of position points around the current position based on the variance of the current position.
5. The method of claim 1, wherein the determining a first probability of each position point being located in each lane according to the position of the position point, the error of the electronic map and the position range of each lane obtained from map data comprises:
determining the transverse position range of the position point on the electronic map according to the error of the electronic map and the position of the position point;
and determining the first probability of each position point being located in each lane in the electronic map according to the transverse position range and the position range of each lane.
6. The method of claim 1, wherein the obtaining of the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where the vehicle is located, which are sensed by at least two sensing technologies, and the matching of the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where each of the position points is located, which are obtained from the map data, and the determining of the second probability that each of the position points is located in each lane comprises:
respectively determining the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, according to the captured road image and the collected point cloud data;
matching the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are determined according to the road image, with the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are obtained from the map data, so as to obtain a third probability that each position point is located in each lane;
matching the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are determined according to the point cloud data, with the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are obtained from the map data, so as to obtain a fourth probability that each position point is located in each lane;
and performing fusion processing on the third probability and the fourth probability of each position point to obtain a second probability that each position point is located in each lane.
7. The method of claim 1, wherein the obtaining of the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where the vehicle is located, which are sensed by at least two sensing technologies, and the matching of the line types of the lane lines on both sides and the distances to the lane lines on both sides of the lane where each of the position points is located, which are obtained from the map data, and the determining of the second probability that each of the position points is located in each lane comprises:
respectively determining the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides according to the captured road image and the collected point cloud data, and fusing the determined line types and distances;
and matching the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides after the fusion processing with the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are obtained from map data, to obtain a second probability that each position point is located in each lane.
8. The method according to claim 7, wherein the matching the line types of the lane lines on the two sides and the distances to the lane lines on the two sides of the lane where the vehicle is located after the fusion processing with the line types of the lane lines on the two sides and the distances to the lane lines on the two sides of the lane where each position point is located, which are obtained from the map data, to obtain the second probability that each position point is located in each lane comprises:
obtaining a line-type probability that the position point is located in each lane according to the line types of the lane lines on both sides of the lane where the position point is located, which are obtained from the map data, and the probability, after the fusion processing, that the lane lines on both sides of the lane where the vehicle is located are of those line types;
obtaining a distance probability that the position point is located in each lane by adopting a selected probability estimation algorithm, according to the difference between the distances from the position point to the lane lines on both sides, which are obtained from map data, and the distance to the lane line on the corresponding side after the fusion processing;
and respectively fusing the line-type probability and the distance probability of each position point to obtain a second probability that each position point is located in each lane.
9. A lane-level positioning device, comprising:
the current position acquisition module is used for positioning the current position of the vehicle;
the position point generating module is used for randomly generating a predetermined number of position points around the current position;
the first probability determination module is used for determining a first probability that each position point is located in each lane according to the position of the position point, the error of the electronic map and the position range of each lane obtained from map data;
the second probability determination module is used for acquiring the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides, which are sensed by at least two sensing technologies, matching them with the line types of the lane lines on both sides of the lane where each position point is located and the distances to the lane lines on both sides, which are acquired from the map data, and determining a second probability that each position point is located in each lane;
and the lane determining module is used for determining the final probability of each position point in each lane according to the first probability and the second probability, and determining the lane with the maximum final probability as the lane in which the vehicle is located.
10. A vehicle positioning system, comprising: the system comprises a server, a vehicle-mounted positioning terminal and a sensing terminal;
the sensing terminal is used for sensing the line types of the lane lines on both sides of the lane where the vehicle is located and the distances to the lane lines on both sides;
the vehicle-mounted positioning terminal is used for positioning the current position of the vehicle;
the server comprises a lane-level positioning device according to claim 9.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a lane-level positioning method according to any one of claims 1 to 8.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing a lane-level positioning method according to any one of claims 1 to 8.
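The position fusion and position-point generation recited in claims 2 to 5 can be sketched as follows; the per-axis scalar Kalman-gain form, the Gaussian sampling, the overlap-based first-probability model and every identifier used are illustrative assumptions rather than the claimed implementation.

```python
import random

# Hypothetical, per-axis sketch of claims 2-5: fuse two preliminary positions
# with a Kalman gain derived from their variances, randomly generate position
# points around the fused position, and derive a first probability per lane
# from the overlap of a point's lateral range with each lane's lateral range.

def kalman_fuse(pos_a, var_a, pos_b, var_b):
    """Fuse two scalar position estimates; returns (fused position, fused variance)."""
    k = var_a / (var_a + var_b)          # Kalman gain coefficient
    fused = pos_a + k * (pos_b - pos_a)  # weight distribution between the two estimates
    fused_var = (1.0 - k) * var_a        # variance of the fused (current) position
    return fused, fused_var

def generate_position_points(x, y, var_x, var_y, n=100):
    """Randomly generate n position points around (x, y) based on the fused variance."""
    sd_x, sd_y = var_x ** 0.5, var_y ** 0.5
    return [(random.gauss(x, sd_x), random.gauss(y, sd_y)) for _ in range(n)]

def first_probability(point_x, map_error, lane_ranges):
    """Overlap of the point's lateral range with each lane's lateral range (assumed model)."""
    lo, hi = point_x - map_error, point_x + map_error
    width = hi - lo
    return {lane_id: max(0.0, min(hi, r_hi) - max(lo, r_lo)) / width
            for lane_id, (r_lo, r_hi) in lane_ranges.items()}
```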
CN202010161665.6A 2020-03-10 2020-03-10 Lane level positioning method, device and system and related equipment Pending CN113375679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010161665.6A CN113375679A (en) 2020-03-10 2020-03-10 Lane level positioning method, device and system and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010161665.6A CN113375679A (en) 2020-03-10 2020-03-10 Lane level positioning method, device and system and related equipment

Publications (1)

Publication Number Publication Date
CN113375679A true CN113375679A (en) 2021-09-10

Family

ID=77568686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010161665.6A Pending CN113375679A (en) 2020-03-10 2020-03-10 Lane level positioning method, device and system and related equipment

Country Status (1)

Country Link
CN (1) CN113375679A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113819918A (en) * 2021-09-17 2021-12-21 中国第一汽车股份有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113819918B (en) * 2021-09-17 2023-05-16 中国第一汽车股份有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113916242A (en) * 2021-12-14 2022-01-11 腾讯科技(深圳)有限公司 Lane positioning method and device, storage medium and electronic equipment
CN113916242B (en) * 2021-12-14 2022-03-25 腾讯科技(深圳)有限公司 Lane positioning method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
US11530924B2 (en) Apparatus and method for updating high definition map for autonomous driving
EP3637371B1 (en) Map data correcting method and device
CN111721285B (en) Apparatus and method for estimating location in automatic valet parking system
US10788830B2 (en) Systems and methods for determining a vehicle position
US10267640B2 (en) Vehicle position estimation device, vehicle position estimation method
JP2022113746A (en) Determination device
CN103270543B (en) Driving assist device
Matthaei et al. Map-relative localization in lane-level maps for ADAS and autonomous driving
US20210207977A1 (en) Vehicle position estimation device, vehicle position estimation method, and computer-readable recording medium for storing computer program programmed to perform said method
JP6881464B2 (en) Self-position estimation method and self-position estimation device
CN110795984A (en) Information processing method, information processing apparatus, and program recording medium
JP2001331787A (en) Road shape estimating device
KR102565482B1 (en) Apparatus for determining position of vehicle and method thereof
CN112805766B (en) Apparatus and method for updating detailed map
CN110608746A (en) Method and device for determining the position of a motor vehicle
JPWO2018212292A1 (en) Information processing apparatus, control method, program, and storage medium
CN112797998A (en) Vehicle lane level positioning method, corresponding program carrier, product, device and vehicle
Suganuma et al. Localization for autonomous vehicle on urban roads
CN113375679A (en) Lane level positioning method, device and system and related equipment
CN114694111A (en) Vehicle positioning
CN113795726B (en) Self-position correction method and self-position correction device
CN113544034A (en) Device and method for acquiring correction information of vehicle sensor
US11920936B2 (en) Vehicle controller, and method and computer program for controlling vehicle
RU2781373C1 (en) Method for correcting one&#39;s location and a device for correcting one&#39;s location
WO2023017624A1 (en) Drive device, vehicle, and method for automated driving and/or assisted driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination