CN114323050A - Vehicle positioning method and device and electronic equipment - Google Patents

Vehicle positioning method and device and electronic equipment

Info

Publication number
CN114323050A
Authority
CN
China
Prior art keywords
lane line
target
matching
line matching
pair
Prior art date
Legal status
Pending
Application number
CN202210012621.6A
Other languages
Chinese (zh)
Inventor
常海玥
韩志华
张旭
Current Assignee
Suzhou Zhitu Technology Co Ltd
Original Assignee
Suzhou Zhitu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Zhitu Technology Co Ltd
Priority to CN202210012621.6A
Publication of CN114323050A

Abstract

The invention provides a vehicle positioning method, a vehicle positioning device and electronic equipment. At least one target lane line matching pair satisfying preset conditions is screened out from a plurality of acquired first lane line matching pairs; the target relative displacement between each target matching point pair in each target lane line matching pair is acquired; and each target relative displacement, the acquired state transition equation and the initial parameters are input into a Kalman filter to obtain a positioning result of the vehicle. In this method the acquired first lane line matching pairs are preprocessed to obtain target lane line matching pairs that satisfy the preset conditions that the visually perceived lane line and the map lane line do not cross and/or that their relative displacement is smaller than a preset distance. This guarantees the information utilization rate and the accuracy of the target lane lines, and because the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, the precision and accuracy of the positioning result are improved.

Description

Vehicle positioning method and device and electronic equipment
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle positioning method and device and electronic equipment.
Background
In an autopilot system, the accuracy of real-time positioning directly affects the realization of accurate navigation. If the positioning error reaches the centimeter level, an intelligent automobile can accurately judge its surrounding environment and its position on the map; in particular, once high-precision map data are accessed, the navigation precision of the intelligent automobile can be pinned to a specific position on a specific lane line, which greatly improves the automatic driving result. In the related art, a GNSS (Global Navigation Satellite System) and an IMU (Inertial Measurement Unit) are generally fused to obtain a more accurate position, but when the GNSS signal is poor or lost, positioning obtained by inertial navigation calculation alone cannot guarantee a stable result, and errors accumulate and diverge over time, resulting in poor positioning precision and accuracy.
Disclosure of Invention
The invention aims to provide a vehicle positioning method, a vehicle positioning device and electronic equipment, so as to alleviate the problem of insufficient vehicle positioning precision and accuracy.
The invention provides a vehicle positioning method, which comprises the following steps: acquiring a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, and a state transition equation and initial parameters of an extended Kalman filter; screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair is not crossed with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance; acquiring target relative displacement between each target matching point pair in each target lane line matching pair; and inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle.
Further, the step of screening out at least one target lane line matching pair satisfying a preset condition from the plurality of first lane line matching pairs includes: deleting the specified lane line matching pairs from the plurality of first lane line matching pairs to obtain at least one target lane line matching pair meeting a preset condition; wherein, appointing lane line matching pair includes: under the scene of the viaduct section, the visual perception lane lines and the map lane lines are crossed with each other.
Further, the step of screening out at least one target lane line matching pair satisfying a preset condition from the plurality of first lane line matching pairs includes: aiming at each first lane line matching pair, acquiring a plurality of first matching point pairs in the current first lane line matching pair; calculating an average relative displacement of a plurality of first relative displacements according to the first relative displacement between two matching points in each first matching point pair; if the average relative displacement is larger than a first preset distance, deleting the current first lane line matching pair; and taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
Further, the step of screening out at least one target lane line matching pair satisfying a preset condition from the plurality of first lane line matching pairs includes: aiming at each first lane line matching pair, acquiring a plurality of first matching point pairs in the current first lane line matching pair; calculating a second relative displacement between two matching points in each first matching point pair; selecting a plurality of designated matching point pairs with second relative displacement smaller than a second preset distance from the plurality of first matching point pairs; determining the lane line matching sections corresponding to the plurality of designated matching point pairs as target lane line matching pairs which are corresponding to the current first lane line matching and meet preset conditions; and taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
Further, before the step of determining the lane line matching segments corresponding to the plurality of designated matching point pairs as the target lane line matching pairs corresponding to the current first lane line matching and meeting the preset condition, the method further includes: and if the plurality of designated matching point pairs are a plurality of continuous matching point pairs, generating lane line matching sections corresponding to the plurality of designated matching point pairs based on the plurality of continuous matching point pairs.
Further, the step of inputting the target relative displacement, the state transition equation and the initial parameter between each target matching point pair into the kalman filter to obtain the positioning result of the vehicle includes: inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter, and outputting a first Jacobian matrix corresponding to the state transition equation; outputting a positioning predicted value and an error covariance predicted value of the vehicle based on a state transition equation and a first Jacobian matrix; and determining a positioning result of the vehicle based on the positioning predicted value and the error covariance predicted value.
Further, the step of determining the positioning result of the vehicle based on the positioning prediction value and the error covariance prediction value includes: acquiring an observation equation of the extended Kalman filter; determining a second Jacobian matrix corresponding to the observation equation based on the observation equation of the extended Kalman filter; determining updated Kalman gain based on the observation equation, the second Jacobian matrix and the error covariance predicted value; and updating the positioning predicted value based on the positioning predicted value, the updated Kalman gain and the observation equation to obtain a positioning estimated value of the vehicle at the current moment, and determining the positioning estimated value as a positioning result of the vehicle.
The invention provides a vehicle positioning device, comprising: the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, a state transition equation of an extended Kalman filter and initial parameters; the screening module is used for screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair is not crossed with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance; the second acquisition module is used for acquiring the target relative displacement between each target matching point pair in each target lane line matching pair; and the third acquisition module is used for inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into the Kalman filter to obtain the positioning result of the vehicle.
The invention provides an electronic device which comprises a processor and a memory, wherein the memory stores machine executable instructions capable of being executed by the processor, and the processor executes the machine executable instructions to realize the vehicle positioning method of any one of the above items.
The present invention provides a machine-readable storage medium having stored thereon machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement a vehicle localization method as in any one of the above.
According to the vehicle positioning method, the vehicle positioning device and the electronic equipment, a plurality of first lane line matching pairs corresponding to mutually matched visually perceived lane lines and map lane lines, together with the state transition equation and initial parameters of an extended Kalman filter, are acquired; at least one target lane line matching pair satisfying preset conditions is screened out from the plurality of first lane line matching pairs, the preset conditions including that the visually perceived lane line in the target lane line matching pair does not cross the map lane line and/or that the relative displacement between the visually perceived lane line and the map lane line in the target lane line matching pair is smaller than a preset distance; the target relative displacement between each target matching point pair in each target lane line matching pair is acquired; and the target relative displacement between each target matching point pair, the state transition equation and the initial parameters are input into the Kalman filter to obtain a positioning result of the vehicle. In this method the acquired first lane line matching pairs are preprocessed to obtain target lane line matching pairs that satisfy the preset conditions that the visually perceived lane line and the map lane line do not cross and/or that their relative displacement is smaller than the preset distance. This guarantees the information utilization rate and the accuracy of the target lane lines, and because the vehicle positioning result is corrected according to the relative displacement between each target matching point pair, the precision and accuracy of the positioning result are improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a vehicle positioning method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another vehicle positioning method provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a lane line matching pair according to an embodiment of the present invention;
FIG. 4 is a flow chart of another vehicle positioning method provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of another lane line matching pair provided in an embodiment of the present invention;
FIG. 6 is a flow chart of another vehicle positioning method provided by an embodiment of the present invention;
FIG. 7 is a schematic diagram of another lane line matching pair provided in an embodiment of the present invention;
FIG. 8 is a block diagram of a positioning system according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a vehicle positioning method according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of a vehicle positioning device according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the following embodiments, and it should be understood that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In an autopilot system, the accuracy of real-time positioning directly affects the implementation of accurate navigation. If the positioning error reaches the centimeter magnitude, the intelligent automobile can accurately judge the surrounding environment and the position of the map where the intelligent automobile is located, and particularly after high-precision map data are accessed, the navigation precision of the intelligent automobile can be butted to a certain position of a certain lane line, so that the automatic driving result is greatly improved.
Real-time positioning technology at the present stage is applied to fields such as automatic driving, assisted driving, unmanned vehicles, unmanned aerial vehicles, robots, map-collection vehicles, high-speed rail, ships and aviation. Taking automatic driving as an example, fusing multiple sensors, such as lidar, cameras, ultrasonic radar, millimeter-wave radar, RTK (Real-Time Kinematic) and IMU inertial sensors, can make the positioning effect more ideal. However, each sensor has its own advantages and disadvantages, and it is difficult to find an excellent fusion scheme in practical applications, so positioning technology has not yet broken through its technical bottleneck.
A navigation system relies on GPS (Global Positioning System) signals to guide a vehicle from its own position to a destination. Under ideal conditions the positioning error lies within 0-10 m, and as the number of available satellites gradually increases, satellite positioning accuracy improves to about 5 m. With the development of automatic driving, however, such accuracy shows an obvious shortcoming: the width of a lane is only about 3.8 m, so the error of satellite positioning is far too large to position the vehicle accurately on a particular lane, and an accuracy of about 5 m is likely to cause wrong lane-level positioning, for example at complicated intersections or during turning. Since positioning is the basis of automatic driving and navigation, an excessive positioning error prevents the vehicle positioning from meeting expectations. In the industry, GNSS and IMU are fused to obtain a more accurate position, but when GNSS signals are poor or lost, the positioning calculated by the IMU alone cannot guarantee a stable result, and errors accumulate and diverge over time, making the positioning result unusable.
An intelligent vehicle can rarely use the IMU as its sole navigation sensor; it must be used together with the chassis, vision sensors, radar and so on. Radar is further classified into lidar, millimeter-wave radar, ultrasonic radar and the like. Lidar offers guaranteed measurement accuracy and detection distance, but it is limited by moving occluders, loses semantic information such as colours on a surface, is expensive and has a short service life; the detectable angle of millimeter-wave radar is limited, so several sensors are needed to achieve accurate positioning; the detection distance of ultrasonic radar is too short and its precision is low. The interference factors that must be eliminated when positioning with radar are numerous and complicated, cost is also a concern, and radar has therefore not produced a major breakthrough in the positioning field.
Vision-based positioning is mostly based on two-dimensional images, and when a two-dimensional image is converted into the three-dimensional world, some information loss and calculation error are inevitable; under special illumination and similar environments, visual deception may even occur. Because depth information is lost when positioning visually, the lateral and longitudinal positioning of the vehicle on the road can be poor.
These situations generally concern the field of automatic driving of passenger vehicles; for commercial vehicles, factors such as a high viewpoint, heavy tonnage, wide track, long wheelbase and the presence or absence of a trailer further increase the positioning difficulty of a commercial truck.
When a commercial vehicle drives in a straight line at high speed, poorly fused sensors lead to the following three problems in actual positioning: (1) when GNSS signals are poor or disappear, for example in rainy weather, in a tunnel, under an overpass or in other areas where the sky is blocked, the lateral positioning error of the commercial vehicle accumulates and diverges, so the positioning accuracy decreases or positioning becomes impossible; (2) when the GNSS signal recovers after suddenly disappearing, it may take some time before the positioning accuracy is corrected back to the state before the signal was lost; (3) when GNSS signals return after being absent, the positioning result jumps laterally by a large amount and is then gradually repaired. On this basis, the embodiments of the invention provide a vehicle positioning method, a vehicle positioning device and electronic equipment, and the technology can be applied to scenes in which a vehicle needs to be positioned during automatic driving.
In order to facilitate understanding of the embodiment, a vehicle positioning method disclosed in the embodiment of the invention is first described in detail; as shown in fig. 1, the method comprises the steps of:
step S102, obtaining a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter.
The visually perceived lane line can be acquired by an intelligent camera; the map lane line may be a lane line within a preset range acquired from a map based on the current position of the vehicle; the perceived lane lines can be matched with a plurality of map lane lines in a local map by related techniques such as the KM algorithm (an algorithm for finding the optimal matching of a weighted bipartite graph) to obtain the first lane line matching pairs. The extended Kalman filter is the extension of the standard Kalman filter to the nonlinear case; it is an efficient recursive filter whose basic idea is to linearize the nonlinear system by Taylor series expansion and then filter the signal within the Kalman filtering framework. The state transition equation may be s_k = f(s_{k-1}) + u_k, which denotes the transition of the vehicle from its state at time k-1 to its state at time k, where s_k = [x_k y_k z_k θ φ γ] denotes the state vector of the vehicle at time k, x_k, y_k and z_k are the position of the vehicle in the east-north-up (ENU) coordinate system, corresponding to the east, north and up directions respectively, θ, φ and γ are the roll, pitch and yaw (Euler) angles of the vehicle in that frame, s_{k-1} is the state vector of the vehicle at time k-1, and u_k ~ N(q, Q) is the state noise, where q and Q are the distribution parameters of the noise, namely the expectation and the covariance of the Gaussian distribution. By the Taylor expansion f(s_{k-1}) ≈ f(s'_{k-1}) + F_{k-1}(s_{k-1} - s'_{k-1}), where F_{k-1} is the Jacobian matrix of f evaluated at the previous estimate s'_{k-1}, one obtains s_k = f(s'_{k-1}) + F_{k-1}(s_{k-1} - s'_{k-1}) + u_k. The initial parameters typically include the initial state vector s_0 of the vehicle and the initial error covariance matrix P_0. In practical implementation, when the precise position of the vehicle needs to be determined, the first lane line matching pairs and the state transition equation and initial parameters of the extended Kalman filter are generally acquired first.
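For illustration only (the patent gives no code), the following minimal Python sketch shows one way the state vector, the state transition function f and the initial parameters of the extended Kalman filter described above could be represented; the placeholder motion model and all names are assumptions, not part of the disclosed method.

```python
import numpy as np

def f(s_prev, delta):
    """State transition s_k = f(s_{k-1}) + u_k (the noise is handled by the filter).

    s_prev : state vector [x, y, z, roll, pitch, yaw] at time k-1 (ENU frame)
    delta  : assumed per-step pose increment, e.g. from dead reckoning
    """
    return s_prev + delta  # placeholder motion model, purely illustrative

s0 = np.zeros(6)       # initial state vector s_0
P0 = np.eye(6)         # initial error covariance matrix P_0
Q = np.eye(6) * 0.01   # assumed covariance Q of the state noise u_k
```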
S104, screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane lines in the target lane line matching pair are not crossed with the map lane lines, and/or the relative displacement between the visual perception lane lines in the target lane line matching pair and the map lane lines is smaller than the preset distance.
The preset distance can be set according to actual requirements; in practical implementation, the obtained first lane line matching pair may have a mismatch or poor match, so that a plurality of first lane line matching pairs are usually required to be preprocessed to screen out one or more target lane line matching pairs meeting preset conditions; for example, for a road section with an overpass, since a high-precision map cannot be layered in the aerial view in the elevation direction, a situation of cross matching between a visual perception lane line and a map lane line in the horizontal and vertical directions may occur, and the situation belongs to wrong matching and usually needs to be deleted; it may also happen that the matching between the visually perceived lane line and the map lane line is correct, but when the vehicle shakes, brakes suddenly, turns or decelerates, the slope or line type of the visually perceived lane line may be inaccurate, and the like, and at this time, the first lane line matching pair in such a scene needs to be segmented to intercept a segment or a point set meeting the preset condition.
And step S106, acquiring the target relative displacement between each target matching point pair in each target lane line matching pair.
And after one or more target lane line matching pairs are obtained, each point in the visual perception lane line and a matching point corresponding to the point on the map lane line are obtained from each target lane line matching pair, and the target relative displacement between the two matched points is calculated.
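As a sketch of this step (assuming each lane line is represented as an (N, 2) array of x/y points in a common vehicle-centred frame, matched index by index), the target relative displacement per matched point pair could be computed as follows; the function name is hypothetical.

```python
import numpy as np

def target_relative_displacements(perceived_pts, map_pts):
    # Euclidean displacement between each visually perceived lane-line point
    # and its matched map lane-line point.
    perceived_pts = np.asarray(perceived_pts, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    return np.linalg.norm(perceived_pts - map_pts, axis=1)
```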
And S108, inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle.
The obtained target relative displacement between each target matching point pair is taken as the observed value; this observed value, the state value corresponding to the state transition equation, and the initial s_0 and P_0 set at k = 0 are input into the Kalman filter, and the positioning result of the vehicle is obtained through the Kalman filter processing.
The vehicle positioning method comprises the steps of obtaining a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, and a state transition equation and initial parameters of an extended Kalman filter; screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair is not crossed with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance; acquiring target relative displacement between each target matching point pair in each target lane line matching pair; and inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle. According to the method, the obtained multiple first lane line matching pairs are preprocessed to obtain the target lane line matching pairs, and the target lane line matching meets the preset condition that the visual perception lane lines and the map lane lines are not crossed and/or the relative displacement is smaller than the preset distance, so that the information utilization rate and the accuracy of the target lane lines can be guaranteed, the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, and the precision and the accuracy of the positioning result can be improved.
The embodiment of the invention also provides another vehicle positioning method, which is realized on the basis of the method of the embodiment; as shown in fig. 2, the method comprises the steps of:
step S202, obtaining a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter.
Step S204, deleting the specified lane line matching pairs from the plurality of first lane line matching pairs to obtain at least one target lane line matching pair meeting preset conditions; wherein, appointing lane line matching pair includes: under the scene of the viaduct section, the visual perception lane lines and the map lane lines are crossed with each other.
Referring to fig. 3, which is a schematic diagram of a lane line matching pair, the thin solid line is the visually perceived lane line and the thick solid line is the map lane line. In a viaduct-section scene, or a scene occluded by other obstacles, the high-precision map cannot be layered in the elevation direction in the bird's-eye view, i.e. the local map taken from the high-precision map carries no height information, so it is temporarily impossible to judge by layering whether the vehicle is on the upper or the lower lane of the viaduct. The visually perceived lane line may then be matched to a map lane line that it crosses transversely or longitudinally; such wrong matches need to be removed according to the constraint of the lane line extension direction.
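A minimal sketch of one way such a crossing could be detected, assuming both lane lines are given as matched (N, 2) point arrays ordered along the driving direction; the sign-change test below is an illustrative choice, not the patent's stated criterion.

```python
import numpy as np

def lane_lines_cross(perceived_pts, map_pts):
    """True if the visually perceived lane line crosses the map lane line."""
    perceived_pts = np.asarray(perceived_pts, dtype=float)
    map_pts = np.asarray(map_pts, dtype=float)
    direction = np.gradient(map_pts, axis=0)
    direction /= np.linalg.norm(direction, axis=1, keepdims=True)
    offset = perceived_pts - map_pts
    # Signed lateral offset via the 2-D cross product; a sign change along
    # the line means the perceived line switches sides, i.e. a crossing.
    lateral = direction[:, 0] * offset[:, 1] - direction[:, 1] * offset[:, 0]
    return bool(np.any(lateral[:-1] * lateral[1:] < 0))
```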
Step S206, acquiring the target relative displacement between each target matching point pair in each target lane line matching pair.
And S208, inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle.
The vehicle positioning method obtains a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter. Deleting the specified lane line matching pairs from the plurality of first lane line matching pairs to obtain at least one target lane line matching pair meeting a preset condition; wherein, appointing lane line matching pair includes: under the scene of the viaduct section, the visual perception lane lines and the map lane lines are crossed with each other. And acquiring the target relative displacement between each target matching point pair in each target lane line matching pair. And inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle. According to the method, the obtained multiple first lane line matching pairs are preprocessed to obtain the target lane line matching pairs, and the target lane line matching meets the preset condition that the visual perception lane lines and the map lane lines are not crossed and/or the distance is smaller than the preset distance, so that the information utilization rate and the accuracy of the target lane lines can be ensured, the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, and the precision and the accuracy of the positioning result can be improved.
The embodiment of the invention also provides another vehicle positioning method, which is realized on the basis of the method of the embodiment; as shown in fig. 4, the method includes the steps of:
step S402, obtaining a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter.
Step S404, aiming at each first track line matching pair, obtaining a plurality of first matching point pairs in the current first track line matching pair.
In actual implementation, the visual perception lane line in the current first lane line matching pair is composed of a plurality of points, the map lane line is also composed of a plurality of points, and a plurality of first matching point pairs corresponding to the visual perception lane line and the map lane line in the current first lane line matching pair are obtained.
Step S406, calculating an average relative displacement of the plurality of first relative displacements according to the first relative displacement between two matching points in each first matching point pair.
And respectively calculating first relative displacement between each point of the visual perception lane line in the current first lane line matching pair and the point corresponding to the map lane line, and averaging the obtained plurality of first relative displacements to obtain the average relative displacement of the plurality of first relative displacements.
And step S408, if the average relative displacement is larger than the first preset distance, deleting the current first lane line matching pair.
The first preset distance may be set according to actual requirements; for example, it may be 0.5 lane width. After the average relative displacement is obtained, it can be compared with the first preset distance: if the average relative displacement is less than or equal to the first preset distance, the current first lane line matching pair can be considered an available matching pair, and if it is greater than the first preset distance, the current first lane line matching pair can be considered a wrong match and needs to be deleted. As shown in fig. 5, another schematic diagram of lane line matching pairs, the mismatches that arise because the high-precision map lane line is inaccurate can be deleted using the constraint that the average error over the point set of a matching pair is not greater than 0.5 lane width.
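A sketch of this screening rule under the stated example threshold of 0.5 lane width; the numeric lane width is an assumption.

```python
def keep_matching_pair(first_relative_displacements, lane_width=3.75):
    """Keep a first lane line matching pair only if the average relative
    displacement over its matched point pairs does not exceed 0.5 lane width."""
    displacements = list(first_relative_displacements)
    average = sum(displacements) / len(displacements)
    return average <= 0.5 * lane_width
```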
And step S410, taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
And preprocessing each first lane line matching pair according to the method to finally obtain one or more target lane line matching pairs meeting preset conditions.
Step S412, a target relative displacement between each target matching point pair in each target lane line matching pair is obtained.
And step S414, inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain the positioning result of the vehicle.
The vehicle positioning method obtains a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter. And aiming at each first track line matching pair, acquiring a plurality of first matching point pairs in the current first track line matching pair. And calculating the average relative displacement of the plurality of first relative displacements according to the first relative displacement between the two matching points in each first matching point pair. And if the average relative displacement is larger than the first preset distance, deleting the current first lane line matching pair. And taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions. And acquiring the target relative displacement between each target matching point pair in each target lane line matching pair. And inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle. According to the method, the obtained multiple first lane line matching pairs are preprocessed to obtain the target lane line matching pairs, and the target lane line matching meets the preset condition that the visual perception lane lines and the map lane lines are not crossed and/or the distance is smaller than the preset distance, so that the information utilization rate and the accuracy of the target lane lines can be ensured, the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, and the precision and the accuracy of the positioning result can be improved.
The embodiment of the invention also provides another vehicle positioning method, which is realized on the basis of the method of the embodiment; as shown in fig. 6, the method includes the steps of:
step S602, obtaining a plurality of first lane line matching pairs corresponding to the mutually matched visual perception lane lines and map lane lines, a state transition equation and initial parameters of the extended Kalman filter.
Step S604, for each first lane line matching pair, obtaining a plurality of first matching point pairs in the current first lane line matching pair.
Step S606, a second relative displacement between two matching points in each first matching point pair is calculated.
Step S608, selecting a plurality of designated matching point pairs with a second relative displacement smaller than a second preset distance from the plurality of first matching point pairs.
The second preset distance may be set according to different road segments or different matching accuracies, for example, the second preset distance may be 1 meter; in practical implementation, the second relative displacement between each point of the visual perception lane line in the current first lane line matching pair and the point corresponding to the map lane line is calculated, for example, any one lane line point in the visual perception lane line and the map lane line in the current first lane line matching pair may be collected, a projection distance from each point to another lane line, that is, a vertical distance, is calculated, and if the projection distance is smaller than a second preset distance, the point and the corresponding matching point may be attributed to the designated matching point pair.
In step S610, if the plurality of designated matching point pairs are a continuous plurality of matching point pairs, lane line matching segments corresponding to the plurality of designated matching point pairs are generated based on the continuous plurality of matching point pairs.
When the designated matching point pairs are continuous matching point pairs, that is, when the second relative displacement between a plurality of continuous points and the corresponding matching points is smaller than the second preset distance, the designated matching point pairs can be linearly fitted to a plurality of continuous points on the visual perception lane line corresponding to the designated matching point pairs to form segments in the current first lane line matching pair, and the plurality of corresponding continuous points on the map lane line are linearly fitted to form segments to obtain lane line matching segments corresponding to the designated matching point pairs.
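As an illustrative sketch of the segmentation step (assuming the displacements are listed in order along the lane line and that the second preset distance is the 1 m example given in the text), consecutive runs of qualifying matching point pairs could be collected like this:

```python
def lane_line_matching_segments(second_relative_displacements,
                                second_preset_distance=1.0, min_points=2):
    """Return (start, end) index ranges of consecutive designated matching
    point pairs whose second relative displacement is below the threshold;
    each such run corresponds to a lane line matching segment to be fitted."""
    displacements = list(second_relative_displacements)
    segments, start = [], None
    for i, d in enumerate(displacements):
        if d < second_preset_distance:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_points:
                segments.append((start, i))
            start = None
    if start is not None and len(displacements) - start >= min_points:
        segments.append((start, len(displacements)))
    return segments
```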
Step S612, determining the lane line matching segments corresponding to the multiple designated matching point pairs as a target lane line matching pair meeting preset conditions and corresponding to the current first lane line matching.
Step S614, taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
And preprocessing each first lane line matching pair according to the method to finally obtain one or more target lane line matching pairs meeting preset conditions.
In practical implementation, poor matches caused by information reading errors of the vision sensor need to be eliminated. A commercial vehicle differs from a passenger vehicle in that its load, body length and so on increase longitudinal and lateral body shaking during driving, the influence of centripetal force is severe during turning, the selectable mounting positions of the vision sensor are limited, and the intrinsic and extrinsic camera parameters have a large influence when the lane line is visually perceived. The raw pixel data [u v 1]^T acquired during perception is first undistorted to give [u' v' 1]^T, and then the projection relation
Z_c [u' v' 1]^T = K [R | T] [X_W Y_W Z_W 1]^T,  with  K = [[f_x, 0, u'_0], [0, f_y, v'_0], [0, 0, 1]],
is used to convert the image into a world coordinate system or another three-dimensional coordinate system, and corresponding errors arise at this step. Here u and v are the coordinates, in the pixel coordinate system, of any pixel of the two-dimensional image acquired by the vision sensor; u' and v' are the coordinates of that pixel in the pixel coordinate system after distortion removal; X_W, Y_W and Z_W are the coordinates of the point in the three-dimensional world coordinate system; R is the rotation matrix that rotates points from the three-dimensional world coordinate system to the three-dimensional camera coordinate system; T is the corresponding translation vector; f_x, f_y, u'_0 and v'_0 are intrinsic camera parameters, where f_x = f·s_x and f_y = f·s_y, f is the focal length, and s_x and s_y are the inverses of the pixel cell size (unit: pixel/μm). Through this relation the three-dimensional position [X_c Y_c Z_c]^T of the point in the camera coordinate system can be obtained; colloquially, Z_c may be understood as the depth of the image-plane point, i.e. its distance from the camera.
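A minimal sketch of the back-projection just described, using the intrinsic parameters f_x, f_y, u'_0, v'_0 and the extrinsic R, T defined above; the function names and the assumed availability of a depth value Z_c are illustrative.

```python
import numpy as np

def pixel_to_camera(u_undist, v_undist, z_c, fx, fy, u0, v0):
    # Pinhole model: u' = fx * Xc / Zc + u'_0, v' = fy * Yc / Zc + v'_0.
    x_c = (u_undist - u0) * z_c / fx
    y_c = (v_undist - v0) * z_c / fy
    return np.array([x_c, y_c, z_c])

def camera_to_world(p_camera, R, T):
    # The text defines R, T as the world-to-camera transform (p_c = R p_w + T),
    # so the world coordinates follow by inverting that transform.
    return np.linalg.inv(np.asarray(R)) @ (np.asarray(p_camera) - np.asarray(T))
```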
The purpose of the conversion to a three-dimensional coordinate system is to match against the map lane line. Because the conversion between coordinate systems (the R and T above) is determined by the extrinsic parameters obtained when the sensor (camera) is calibrated, and the calibration itself has errors, the lane line matching precision is inevitably affected when the coordinate system conversion is made, so the matching between the converted three-dimensional lane line and the map lane line can be poor, specifically as follows:
referring to fig. 7, another schematic diagram of lane line matching pairs is shown, as shown in fig. 7(a), when the commercial vehicle body is in a high-speed straight-line driving process and has severe jitter or an emergency lane change, the slope of the lane line sensed by the camera (corresponding to the above-mentioned visually-sensed lane line) is different from that of the lane line of the high-precision map due to inertia. As shown in fig. 7(b), when emergency braking is performed in an emergency, if the lane line information obtained by semantic division by sensing exceeds the distortion removal range of the wide-angle camera, the lane line is distorted, and the lane line which should be sensed as a straight line is erroneously divided into curves. As shown in fig. 7(c), when the vehicle makes a deceleration movement while turning, the number of axles of the commercial vehicle is large, the length of the commercial vehicle is long, the vehicle head tilts upwards or lifts off the ground temporarily, the front-view camera is high in installation position, the length of the perceivable lane line is short, and the original curve can be perceived as a straight line. As shown in fig. 7(d), when the vehicle turns at a high speed, the head of the vehicle will shift in the opposite direction of the turning due to the centripetal force, so that the camera forms an immeasurable angle with the ground in the transverse direction, and the perceived lane line (corresponding to the visually perceived lane line) has a curvature different from that of the lane line on the high-precision map.
In the above four cases the matching pair itself is correct, and the root of the problem lies in the perception of the vision sensor, so the matching result cannot simply be accepted or discarded as a whole. In such cases the visually perceived lane line needs to be segmented, and only the segments or point sets that meet the requirement participate in the fused positioning.
Step S616, obtaining the target relative displacement between each target matching point pair in each target lane line matching pair.
It should be noted that, if the matching point pair is designated as a discrete matching point pair, the target relative displacement between two matching points in the discrete matching point pair can be directly obtained.
Step S618, inputting the target relative displacement, the state transition equation, and the initial parameter between each target matching point pair into the kalman filter, and obtaining a positioning result of the vehicle.
The vehicle positioning method obtains a plurality of first lane line matching pairs corresponding to the visual perception lane lines and the map lane lines which are matched with each other, and a state transition equation and initial parameters of the extended Kalman filter. And aiming at each first track line matching pair, acquiring a plurality of first matching point pairs in the current first track line matching pair. A second relative displacement between two matching points in each first matching point pair is calculated. And selecting a plurality of appointed matching point pairs with second relative displacement smaller than a second preset distance from the plurality of first matching point pairs. And if the plurality of designated matching point pairs are a plurality of continuous matching point pairs, generating lane line matching sections corresponding to the plurality of designated matching point pairs based on the plurality of continuous matching point pairs. And determining the lane line matching sections corresponding to the plurality of specified matching point pairs as target lane line matching pairs which are corresponding to the current first lane line matching and meet preset conditions. And taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions. And acquiring the target relative displacement between each target matching point pair in each target lane line matching pair. And inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle. According to the method, the obtained multiple first lane line matching pairs are preprocessed to obtain the target lane line matching pairs, and the target lane line matching meets the preset condition that the visual perception lane lines and the map lane lines are not crossed and/or the distance is smaller than the preset distance, so that the information utilization rate and the accuracy of the target lane lines can be ensured, the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, and the precision and the accuracy of the positioning result can be improved.
The embodiment of the invention also provides another vehicle positioning method, which is realized on the basis of the method of the embodiment; the method comprises the following steps:
the method comprises the steps of firstly, obtaining a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, and state transition equations and initial parameters of an extended Kalman filter.
Screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane lines in the target lane line matching pair are not crossed with the map lane lines, and/or the relative displacement between the visual perception lane lines in the target lane line matching pair and the map lane lines is smaller than the preset distance.
And step three, acquiring the target relative displacement between each target matching point pair in each target lane line matching pair.
And fourthly, inputting the target relative displacement, the state transition equation and the initial parameters among each target matching point pair into a Kalman filter, and outputting a first Jacobian matrix corresponding to the state transition equation.
In practical implementation, after the target relative displacement between each target matching point pair, the state transition equation and the initial parameters are input into the Kalman filter, the first Jacobian matrix corresponding to the state transition equation can be obtained. Specifically, with reference to the foregoing embodiment, the state transition equation obtained after the Taylor expansion is s_k = f(s'_{k-1}) + F_{k-1}(s_{k-1} - s'_{k-1}) + u_k, and its first Jacobian matrix evaluated at s'_{k-1} is F_{k-1} = ∂f/∂s evaluated at s = s'_{k-1}.
And fifthly, outputting a positioning predicted value and an error covariance predicted value of the vehicle based on the state transition equation and the first Jacobian matrix.
In practical implementation, the state transition equation and the first Jacobian matrix can be combined to obtain the predicted state of the system, i.e. the predicted positioning pose of the vehicle (corresponding to the positioning predicted value) s'_k = F_{k-1} s'_{k-1}; the corresponding error covariance predicted value is P'_k = F_{k-1} P'_{k-1} F_{k-1}^T + Q, where P'_{k-1} denotes the value of the error covariance matrix at time k-1.
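A compact sketch of this prediction step, assuming numpy arrays for the state and covariance; the linearised propagation mirrors the two formulas above.

```python
import numpy as np

def ekf_predict(s_prev, P_prev, F, Q):
    """Extended Kalman filter prediction step.

    F : first Jacobian matrix of the state transition equation at s'_{k-1}
    Q : state-noise covariance
    """
    s_pred = F @ s_prev                 # positioning predicted value s'_k
    P_pred = F @ P_prev @ F.T + Q       # error covariance predicted value P'_k
    return s_pred, P_pred
```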
And step six, determining a positioning result of the vehicle based on the positioning predicted value and the error covariance predicted value.
The sixth step can be specifically realized by the following steps a to D:
and A, acquiring an observation equation of the extended Kalman filter.
And B, determining a second Jacobian matrix corresponding to the observation equation based on the observation equation of the extended Kalman filter.
The state transition equation and the observation mode in this embodiment may be determined according to the update method of the system and the observation method of the sensor. The observation equation may be z_k = h(s_k) + v_k, where h(s_k) denotes the observation function that maps the state vector into the space in which the measurement lies, and v_k ~ N(r, R) denotes the observation noise. From the Taylor expansion one obtains z_k = h(s'_k) + H_k(s_k - s'_k) + v_k, where h(s'_k) denotes the predicted observation obtained by inputting the predicted pose s'_k into the observation equation, and the second Jacobian matrix evaluated at s'_k is H_k = ∂h/∂s evaluated at s = s'_k.
And step C, determining the updated Kalman gain based on the observation equation, the second Jacobian matrix and the error covariance predicted value.
And D, updating the positioning predicted value based on the positioning predicted value, the updated Kalman gain and the observation equation to obtain a positioning estimated value of the vehicle at the current moment, and determining the positioning estimated value as a positioning result of the vehicle.
After the positioning predicted value and the error covariance predicted value of the vehicle are obtained, the updated Kalman gain can be obtained as K_k = P'_k H_k^T (H_k P'_k H_k^T + R)^{-1}. From the updated Kalman gain, the positioning predicted value and the observation equation, the positioning estimated value of the vehicle at the current moment can be obtained as <s'_k> = s'_k + K_k (z_k - h(s'_k)), where the relative displacement between the corresponding line pairs, or between the corresponding point pairs, of the visual perception and high-precision map lane line matching pairs is substituted into the calculation. Finally, the error covariance matrix is updated as P_k = P'_k - K_k H_k P'_k.
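A corresponding sketch of the update step (steps A to D), again assuming numpy arrays; h is the observation function and the target relative displacements enter through the measurement z.

```python
import numpy as np

def ekf_update(s_pred, P_pred, z, h, H, R):
    """Extended Kalman filter update step.

    z : observed target relative displacements between matched point pairs
    h : observation function mapping the state into the measurement space
    H : second Jacobian matrix of the observation equation at s'_k
    R : observation-noise covariance
    """
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # updated Kalman gain K_k
    s_est = s_pred + K @ (z - h(s_pred))   # positioning estimate <s'_k>
    P_est = P_pred - K @ H @ P_pred        # updated error covariance P_k
    return s_est, P_est
```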
According to the vehicle positioning method, the lane line matching results that remain after the wrong matches are removed are fed into the filter of the positioning system to obtain the vision fusion positioning result, which ensures that the lateral deviation of lane-line-level positioning during driving stays within an acceptable range.
For further understanding of the above embodiments, refer to the schematic diagram of a positioning system framework shown in fig. 8. The sensors used in automatic driving are divided into the combined GNSS and IMU, radar, chassis and visual perception information; lane line feature matching can be performed between the visual perception information and the high-precision map information for matching fusion, and the positioning result of the vehicle is obtained by combining this with GNSS and IMU fusion, radar fusion and chassis fusion. In a concrete implementation, the overall fusion technical scheme must first be confirmed, and the fusion scheme is determined according to the different types, quantities, precisions and working modes of the sensors installed on the commercial vehicle:
the distance confidence between the sensor with higher precision and the wall surface or the road edge around the vehicle can be properly improved if the radar distance observation precision is higher, the longitudinal pose of the chassis wheel can be better corrected, and the transverse error correction can be greatly performed by matching the semantic segmentation result obtained by visual perception with a high-precision map.
When multiple types or multiple numbers of sensors run simultaneously, the compatibility problems must be clarified according to the order of the data acquisition times of the sensors: the data acquisition frequencies cannot be unified, and the data calculation methods and complexities differ, so sorting this out lays the foundation for the subsequent output of the positioning result fusing vision with high-precision map lane line matching. The compatibility issues include: whether the data acquired by each sensor have been processed by an internal system and whether their timing is reasonable; and whether the sensors contradict each other, for example visual perception says that the vehicle is offset towards the right side of the lane line at this moment while the wheel-speed differential model positions the vehicle offset to the left of the lane line. After measuring the respective accuracies of the two sensors, the observation noise can be adjusted to resolve such conflicts. Reasonably adjusting the observation noise of the observation equation in the Kalman filter coordinates the reliability of the data obtained by each sensor.
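As a purely illustrative example of how this coordination could look in a configuration (all sensor names and variance values are assumptions, not taken from the patent), each observation source might simply be assigned its own noise variance, with the more trusted sensor given the smaller value:

```python
# Hypothetical observation-noise variances (m^2) per observation source;
# a smaller value makes the Kalman filter trust that source more when
# observations conflict, e.g. vision vs. the wheel-speed model.
OBSERVATION_NOISE_VARIANCE = {
    "vision_lane_line_matching": 0.05,
    "wheel_speed_model": 0.20,
    "radar_range": 0.10,
}
```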
Referring to the schematic diagram of a vehicle positioning method shown in fig. 9, the plurality of lane line matching pairs obtained after matching the visual perception lane lines with the high-precision map lane lines (corresponding to the first lane line matching pairs) are preprocessed to obtain available matching pairs (corresponding to the target lane line matching pairs), and the relative displacement between the two points of each matching point pair in each available matching pair is input to the Kalman filter as an observed value. The state noise corresponds to the above u_k, the positioning system state corresponds to the above f(s_{k-1}), and combining the two gives the state transition equation s_k = f(s_{k-1}) + u_k. The state value s_k is input to the Kalman filter, together with the initialization s_0 and the error covariance matrix P_0 at k = 0. After processing these inputs, the Kalman filter outputs the first Jacobian matrix F_{k-1} corresponding to the state transition equation, and the prediction module obtains the positioning prediction pose of the vehicle s'_k = F_{k-1} s'_{k-1} and the error covariance prediction value P'_k = F_{k-1} P_{k-1} F_{k-1}^T + Q.
The state observer corresponds to the above h(s_k) and the observation noise is v_k ~ N(0, R); combining the two gives the observation equation z_k = h(s_k) + v_k. The observation equation is input to the optimal state estimator, the second Jacobian matrix corresponding to the observation equation is determined, and the positioning prediction pose and the error covariance prediction value are updated based on the positioning prediction pose, the observation equation, the second Jacobian matrix and the error covariance prediction value, yielding the updated Kalman gain, the vision fusion positioning result <s'_k> = s'_k + K_k(z_k - h(s'_k)) and the updated error covariance matrix P_k. The result is fed back into the positioning system state to improve the vehicle positioning at the next moment, and the vision fusion positioning result <s'_k> = s'_k + K_k(z_k - h(s'_k)) is taken as the output, which achieves the optimal estimation of the vehicle positioning system state and corresponds to the positioning result of the vehicle at the current moment.
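A minimal extended Kalman filter sketch of the prediction and update cycle described above is given below; the state layout, the motion model f, the observation model h and all noise values are placeholder assumptions chosen for illustration, not the specific models of this embodiment.

```python
import numpy as np

def numerical_jacobian(fn, x, eps=1e-6):
    """Finite-difference Jacobian of fn at x (a stand-in for the first and
    second Jacobian matrices F_{k-1} and H_k mentioned in the text)."""
    y0 = fn(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (fn(x + dx) - y0) / eps
    return J

def ekf_step(s, P, z, f, h, Q, R):
    """One EKF cycle: predict s'_k = f(s_{k-1}), P'_k = F P F^T + Q, then
    update with K_k = P' H^T (H P' H^T + R)^-1 and s_k = s'_k + K_k (z_k - h(s'_k))."""
    F = numerical_jacobian(f, s)
    s_pred = f(s)
    P_pred = F @ P @ F.T + Q

    H = numerical_jacobian(h, s_pred)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    s_new = s_pred + K @ (z - h(s_pred))
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return s_new, P_new

# Toy example (all models and numbers are assumptions): state = [x, y, heading],
# observation = lateral offset between the perceived and map lane lines.
f = lambda s: np.array([s[0] + 0.1 * np.cos(s[2]), s[1] + 0.1 * np.sin(s[2]), s[2]])
h = lambda s: np.array([s[1]])
s0, P0 = np.zeros(3), np.eye(3) * 0.1
Q, R = np.eye(3) * 1e-3, np.eye(1) * 0.05
s1, P1 = ekf_step(s0, P0, np.array([0.2]), f, h, Q, R)
```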
The vehicle positioning method aims to improve the lateral positioning when the satellite signal is weak or absent and the positioning calculated from the IMU or wheel speed is laterally inaccurate or divergent. The relative displacement between the two matching points of a matching point pair can be used to improve the lateral positioning: from the distance between the visually perceived lane line and the high-precision map lane line, the intelligent vehicle can tell how far its assumed position deviates laterally from its true position, and corrects the assumed position accordingly.
Compared with the classical fusion of a single GNSS with an IMU, this vehicle positioning mode improves the positioning effect after adding the fusion of visual perception with high-precision map matching. When the GNSS signal is weak or disappears, adding the fusion of visual perception with the high-precision map keeps the positioning error from accumulating and diverging. When the GNSS signal changes suddenly, adding the fusion of visual perception with the high-precision map better maintains the positioning effect, so that a relatively stable positioning state is kept. The invention can make full use of the various kinds of knowledge from the perception system and the high-precision map, increases the utilization rate of the perception sensors and their information, improves the lateral positioning precision of a commercial vehicle during high-speed straight-line driving, enhances the environment perception capability of intelligent driving and improves the self-positioning effect. By adding the fusion of visual perception with high-precision map matching to the GNSS and IMU fusion system, the lateral error of a commercial truck during high-speed straight-line driving neither accumulates nor diverges when the GNSS signal is weak or absent and the IMU gyroscope has errors, and the error can be corrected quickly when the GNSS signal recovers.
An embodiment of the present invention provides a vehicle positioning apparatus, as shown in fig. 10, the apparatus including: a first obtaining module 100, configured to obtain a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, a state transition equation of the extended kalman filter, and an initial parameter; the screening module 101 is configured to screen at least one target lane line matching pair that meets a preset condition from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair is not crossed with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance; a second obtaining module 102, configured to obtain a target relative displacement between each target matching point pair in each target lane line matching pair; and the third obtaining module 103 is configured to input the target relative displacement, the state transition equation, and the initial parameter between each target matching point pair into the kalman filter, so as to obtain a positioning result of the vehicle.
The vehicle positioning device acquires a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, a state transition equation and initial parameters of an extended Kalman filter; screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair is not crossed with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance; acquiring target relative displacement between each target matching point pair in each target lane line matching pair; and inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter to obtain a positioning result of the vehicle. The device obtains the target lane line matching pairs by preprocessing the obtained first lane line matching pairs, and the target lane line matching satisfies the preset conditions that the visual perception lane lines are not crossed with the map lane lines and/or the relative displacement is smaller than the preset distance, so that the information utilization rate and the accuracy of the target lane lines can be ensured, the vehicle positioning result is corrected according to the relative displacement between each target matching point pair in the target lane line matching pairs, and the precision and the accuracy of the positioning result can be improved.
Further, the screening module is further configured to: delete specified lane line matching pairs from the plurality of first lane line matching pairs to obtain at least one target lane line matching pair meeting the preset conditions; wherein the specified lane line matching pairs include: lane line matching pairs in which, in the scene of a viaduct section, the visual perception lane line and the map lane line cross each other.
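One possible realisation of this screening, sketched under the assumption that each lane line is given as an ordered list of 2D points in a common frame (the data layout and the intersection test are illustrative, not the exact procedure of this disclosure):

```python
def _segments_intersect(p1, p2, q1, q2):
    """Strict intersection test between segments p1-p2 and q1-q2."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def polylines_cross(line_a, line_b):
    """True if the perceived lane line and the map lane line intersect."""
    return any(
        _segments_intersect(line_a[i], line_a[i + 1], line_b[j], line_b[j + 1])
        for i in range(len(line_a) - 1)
        for j in range(len(line_b) - 1)
    )

def drop_crossing_pairs(matching_pairs):
    """Remove first lane line matching pairs whose perceived and map lane lines
    cross each other (e.g. mismatches on viaduct sections); the remaining pairs
    are candidate target lane line matching pairs."""
    return [(perc, mp) for perc, mp in matching_pairs if not polylines_cross(perc, mp)]
```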
Further, the screening module is further configured to: for each first lane line matching pair, acquire a plurality of first matching point pairs in the current first lane line matching pair; calculate, from the first relative displacement between the two matching points in each first matching point pair, the average of the plurality of first relative displacements; if the average relative displacement is larger than a first preset distance, delete the current first lane line matching pair; and take the next first lane line matching pair as the new current first lane line matching pair and repeat the step of acquiring a plurality of first matching point pairs in the current first lane line matching pair, so as to obtain at least one target lane line matching pair meeting the preset conditions.
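The average-displacement screening could be sketched as follows; each matching pair is assumed to be a list of (perceived point, map point) tuples in a common frame, and the 0.8 m threshold is an assumed value for the first preset distance.

```python
import numpy as np

def filter_by_average_displacement(matching_pairs, first_preset_distance=0.8):
    """Keep only the lane line matching pairs whose average point-to-point
    relative displacement is below the first preset distance (assumed 0.8 m)."""
    targets = []
    for point_pairs in matching_pairs:
        displacements = [np.linalg.norm(np.subtract(p_perc, p_map))
                         for p_perc, p_map in point_pairs]
        if np.mean(displacements) <= first_preset_distance:
            targets.append(point_pairs)
    return targets
```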
Further, the screening module is further configured to: for each first lane line matching pair, acquire a plurality of first matching point pairs in the current first lane line matching pair; calculate a second relative displacement between the two matching points in each first matching point pair; select, from the plurality of first matching point pairs, a plurality of designated matching point pairs whose second relative displacement is smaller than a second preset distance; determine the lane line matching sections corresponding to the plurality of designated matching point pairs as target lane line matching pairs which correspond to the current first lane line matching pair and meet the preset conditions; and take the next first lane line matching pair as the new current first lane line matching pair and repeat the step of acquiring a plurality of first matching point pairs in the current first lane line matching pair, so as to obtain at least one target lane line matching pair meeting the preset conditions.
Further, the screening module is further configured to: and if the plurality of designated matching point pairs are a plurality of continuous matching point pairs, generating lane line matching sections corresponding to the plurality of designated matching point pairs based on the plurality of continuous matching point pairs.
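Combining the per-point screening above with the consecutive-segment condition, a sketch might look like the following; the 0.5 m threshold for the second preset distance and the data layout are assumptions for illustration.

```python
import numpy as np

def extract_matching_sections(point_pairs, second_preset_distance=0.5):
    """Select designated matching point pairs whose relative displacement is
    below the second preset distance (assumed 0.5 m) and group consecutive
    selections into lane line matching sections used as target pairs."""
    sections, current = [], []
    for p_perc, p_map in point_pairs:
        if np.linalg.norm(np.subtract(p_perc, p_map)) < second_preset_distance:
            current.append((p_perc, p_map))
        elif current:
            sections.append(current)
            current = []
    if current:
        sections.append(current)
    return sections  # each section is a candidate target lane line matching pair
```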
Further, the third obtaining module is further configured to: inputting the target relative displacement, the state transition equation and the initial parameters between each target matching point pair into a Kalman filter, and outputting a first Jacobian matrix corresponding to the state transition equation; outputting a positioning predicted value and an error covariance predicted value of the vehicle based on a state transition equation and a first Jacobian matrix; and determining a positioning result of the vehicle based on the positioning predicted value and the error covariance predicted value.
Further, the third obtaining module is further configured to: acquiring an observation equation of the extended Kalman filter; determining a second Jacobian matrix corresponding to the observation equation based on the observation equation of the extended Kalman filter; determining updated Kalman gain based on the observation equation, the second Jacobian matrix and the error covariance predicted value; and updating the positioning predicted value based on the positioning predicted value, the updated Kalman gain and the observation equation to obtain a positioning estimated value of the vehicle at the current moment, and determining the positioning estimated value as a positioning result of the vehicle.
The implementation principle and the generated technical effects of the vehicle positioning device provided by the embodiment of the invention are the same as those of the vehicle positioning method embodiment, and for brief description, reference may be made to corresponding contents in the vehicle positioning method embodiment for the part where the embodiment of the vehicle positioning device is not mentioned.
An embodiment of the present invention further provides an electronic device, as shown in fig. 11, the electronic device includes a processor 130 and a memory 131, the memory 131 stores machine executable instructions capable of being executed by the processor 130, and the processor 130 executes the machine executable instructions to implement the vehicle positioning method.
Further, the electronic device shown in fig. 11 further includes a bus 132 and a communication interface 133, and the processor 130, the communication interface 133, and the memory 131 are connected through the bus 132.
The memory 131 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk storage. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 133 (which may be wired or wireless), and the internet, a wide area network, a local area network, a metropolitan area network, and the like can be used. The bus 132 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one double-headed arrow is shown in fig. 11, but that does not indicate only one bus or one type of bus.
The processor 130 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 130. The processor 130 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or perform the various methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or registers. The storage medium is located in the memory 131, and the processor 130 reads the information in the memory 131 and completes the steps of the method of the foregoing embodiments in combination with its hardware.
The embodiment of the present invention further provides a machine-readable storage medium, where the machine-readable storage medium stores machine-executable instructions, and when the machine-executable instructions are called and executed by a processor, the machine-executable instructions cause the processor to implement the vehicle positioning method.
The computer program product of the vehicle positioning method, the vehicle positioning device and the electronic device provided in the embodiments of the present invention includes a computer-readable storage medium storing program code, and the instructions included in the program code may be used to execute the methods described in the foregoing method embodiments; for specific implementation, reference may be made to the method embodiments, which will not be described herein again.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A vehicle positioning method, characterized in that the method comprises:
acquiring a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, and a state transition equation and initial parameters of an extended Kalman filter;
screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair does not intersect with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance;
acquiring target relative displacement between each target matching point pair in each target lane line matching pair;
and inputting the target relative displacement between each target matching point pair, the state transition equation and the initial parameters into a Kalman filter to obtain a positioning result of the vehicle.
2. The method of claim 1, wherein the step of screening at least one target lane line matching pair satisfying a preset condition from among the plurality of first lane line matching pairs comprises:
deleting specified lane line matching pairs from the plurality of first lane line matching pairs to obtain at least one target lane line matching pair meeting the preset conditions; wherein the specified lane line matching pairs comprise: lane line matching pairs in which, in the scene of a viaduct section, the visual perception lane line and the map lane line cross each other.
3. The method of claim 1, wherein the step of screening at least one target lane line matching pair satisfying a preset condition from among the plurality of first lane line matching pairs comprises:
aiming at each first lane line matching pair, acquiring a plurality of first matching point pairs in the current first lane line matching pair;
calculating an average relative displacement of a plurality of first relative displacements according to a first relative displacement between two matching points in each first matching point pair;
if the average relative displacement is larger than a first preset distance, deleting the current first lane line matching pair;
and taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
4. The method of claim 1, wherein the step of screening at least one target lane line matching pair satisfying a preset condition from among the plurality of first lane line matching pairs comprises:
aiming at each first lane line matching pair, acquiring a plurality of first matching point pairs in the current first lane line matching pair;
calculating a second relative displacement between two matching points in each first matching point pair;
selecting a plurality of designated matching point pairs with second relative displacement smaller than a second preset distance from the plurality of first matching point pairs;
determining the lane line matching sections corresponding to the plurality of designated matching point pairs as target lane line matching pairs which correspond to the current first lane line matching pair and meet the preset conditions;
and taking the next first lane line matching pair as a new current first lane line matching pair, and repeatedly executing the step of obtaining a plurality of first matching point pairs in the current first lane line matching pair to obtain at least one target lane line matching pair meeting preset conditions.
5. The method according to claim 4, wherein before the step of determining the lane line matching sections corresponding to the plurality of designated matching point pairs as the target lane line matching pairs which correspond to the current first lane line matching pair and meet the preset conditions, the method further comprises:
and if the plurality of designated matching point pairs are a plurality of continuous matching point pairs, generating lane line matching sections corresponding to the plurality of designated matching point pairs based on the plurality of continuous matching point pairs.
6. The method of claim 1, wherein the target relative displacement between each target matching point pair, the state transition equation and the initial parameters are input into a kalman filter, and the step of obtaining the positioning result of the vehicle comprises:
inputting the target relative displacement between each target matching point pair, the state transition equation and the initial parameter into a Kalman filter, and outputting a first Jacobian matrix corresponding to the state transition equation;
outputting a positioning predicted value and an error covariance predicted value of the vehicle based on the state transition equation and the first Jacobian matrix;
determining a positioning result of the vehicle based on the positioning prediction value and the error covariance prediction value.
7. The method of claim 6, wherein determining the positioning result of the vehicle based on the positioning prediction value and the error covariance prediction value comprises:
acquiring an observation equation of the extended Kalman filter;
determining a second Jacobian matrix corresponding to the observation equation based on the observation equation of the extended Kalman filter;
determining an updated Kalman gain based on the observation equation, the second Jacobian matrix and the error covariance prediction value;
updating the positioning predicted value based on the positioning predicted value, the updated Kalman gain and the observation equation to obtain a positioning estimated value of the vehicle at the current moment, and determining the positioning estimated value as a positioning result of the vehicle.
8. A vehicle locating apparatus, characterized in that the apparatus comprises:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a plurality of first lane line matching pairs corresponding to mutually matched visual perception lane lines and map lane lines, a state transition equation of an extended Kalman filter and initial parameters;
the screening module is used for screening at least one target lane line matching pair meeting preset conditions from the plurality of first lane line matching pairs; wherein the preset conditions include: the visual perception lane line in the target lane line matching pair does not intersect with the map lane line, and/or the relative displacement between the visual perception lane line in the target lane line matching pair and the map lane line is smaller than a preset distance;
the second acquisition module is used for acquiring the target relative displacement between each target matching point pair in each target lane line matching pair;
and the third acquisition module is used for inputting the target relative displacement between each target matching point pair, the state transition equation and the initial parameter into a Kalman filter to obtain a positioning result of the vehicle.
9. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the vehicle localization method of any one of claims 1-7.
10. A machine-readable storage medium having stored thereon machine-executable instructions which, when invoked and executed by a processor, cause the processor to implement the vehicle localization method of any one of claims 1-7.
CN202210012621.6A 2022-01-07 2022-01-07 Vehicle positioning method and device and electronic equipment Pending CN114323050A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210012621.6A CN114323050A (en) 2022-01-07 2022-01-07 Vehicle positioning method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210012621.6A CN114323050A (en) 2022-01-07 2022-01-07 Vehicle positioning method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114323050A true CN114323050A (en) 2022-04-12

Family

ID=81025588

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210012621.6A Pending CN114323050A (en) 2022-01-07 2022-01-07 Vehicle positioning method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114323050A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115950441A (en) * 2023-03-08 2023-04-11 智道网联科技(北京)有限公司 Fusion positioning method and device for automatic driving vehicle and electronic equipment
CN117490728A (en) * 2023-12-28 2024-02-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103954275A (en) * 2014-04-01 2014-07-30 西安交通大学 Lane line detection and GIS map information development-based vision navigation method
CN108413971A (en) * 2017-12-29 2018-08-17 驭势科技(北京)有限公司 Vehicle positioning technology based on lane line and application
JP2018151822A (en) * 2017-03-13 2018-09-27 アルパイン株式会社 Electronic device, driving lane detection program and driving lane detection method
CN110556012A (en) * 2019-09-16 2019-12-10 北京百度网讯科技有限公司 Lane positioning method and vehicle positioning system
CN111582079A (en) * 2020-04-24 2020-08-25 杭州鸿泉物联网技术股份有限公司 Lane positioning method and device based on computer vision
CN112964260A (en) * 2021-02-01 2021-06-15 东风商用车有限公司 Automatic driving positioning method, device, equipment and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103954275A (en) * 2014-04-01 2014-07-30 西安交通大学 Lane line detection and GIS map information development-based vision navigation method
JP2018151822A (en) * 2017-03-13 2018-09-27 アルパイン株式会社 Electronic device, driving lane detection program and driving lane detection method
CN108413971A (en) * 2017-12-29 2018-08-17 驭势科技(北京)有限公司 Vehicle positioning technology based on lane line and application
CN110556012A (en) * 2019-09-16 2019-12-10 北京百度网讯科技有限公司 Lane positioning method and vehicle positioning system
CN111582079A (en) * 2020-04-24 2020-08-25 杭州鸿泉物联网技术股份有限公司 Lane positioning method and device based on computer vision
CN112964260A (en) * 2021-02-01 2021-06-15 东风商用车有限公司 Automatic driving positioning method, device, equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115950441A (en) * 2023-03-08 2023-04-11 智道网联科技(北京)有限公司 Fusion positioning method and device for automatic driving vehicle and electronic equipment
CN117490728A (en) * 2023-12-28 2024-02-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system
CN117490728B (en) * 2023-12-28 2024-04-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system

Similar Documents

Publication Publication Date Title
CN111307162B (en) Multi-sensor fusion positioning method for automatic driving scene
CN110709890B (en) Map data correction method and device
CN109946732B (en) Unmanned vehicle positioning method based on multi-sensor data fusion
US10384679B2 (en) Travel control method and travel control apparatus
Rose et al. An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
Matthaei et al. Map-relative localization in lane-level maps for ADAS and autonomous driving
CN107328411A (en) Vehicle positioning system and automatic driving vehicle
CN107328410A (en) Method and automobile computer for positioning automatic driving vehicle
Barjenbruch et al. Joint spatial-and Doppler-based ego-motion estimation for automotive radars
CN111582079A (en) Lane positioning method and device based on computer vision
CN114111775B (en) Multi-sensor fusion positioning method and device, storage medium and electronic equipment
CN114323050A (en) Vehicle positioning method and device and electronic equipment
CN114636993A (en) External parameter calibration method, device and equipment for laser radar and IMU
JP4596566B2 (en) Self-vehicle information recognition device and self-vehicle information recognition method
WO2019208101A1 (en) Position estimating device
WO2022041706A1 (en) Positioning method, positioning system, and vehicle
CN113252022A (en) Map data processing method and device
CN112154303B (en) High-precision map positioning method, system, platform and computer readable storage medium
JP7010535B2 (en) Information processing equipment
CN116399324A (en) Picture construction method and device, controller and unmanned vehicle
CN116147605A (en) Vehicle automatic driving map generation method, device, equipment and storage medium
CN115456898A (en) Method and device for building image of parking lot, vehicle and storage medium
CN113405555B (en) Automatic driving positioning sensing method, system and device
WO2022133986A1 (en) Accuracy estimation method and system
Velat et al. Vision based vehicle localization for autonomous navigation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination