CN114034307B - Vehicle pose calibration method and device based on lane lines and electronic equipment - Google Patents

Vehicle pose calibration method and device based on lane lines and electronic equipment

Info

Publication number
CN114034307B
CN114034307B (application CN202111375141.8A)
Authority
CN
China
Prior art keywords
vehicle
lane line
information
coordinate system
pose
Prior art date
Legal status
Active
Application number
CN202111375141.8A
Other languages
Chinese (zh)
Other versions
CN114034307A (en)
Inventor
王林杰
张海强
李成军
Current Assignee
Zhidao Network Technology Beijing Co Ltd
Original Assignee
Zhidao Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhidao Network Technology Beijing Co Ltd
Priority to CN202111375141.8A
Publication of CN114034307A
Application granted
Publication of CN114034307B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/48 - Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485 - Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system

Abstract

The application relates to a lane line-based vehicle pose calibration method and device, and an electronic device. The method comprises the following steps: acquiring first lane line information in an external environment image of the current position of the vehicle, and acquiring second lane line information of the current position of the vehicle in a corresponding high-precision map; acquiring a first sampling point corresponding to the first lane line information in a vehicle coordinate system and a second sampling point corresponding to the second lane line information in the vehicle coordinate system; matching the first sampling point with the corresponding second sampling point to obtain a pose calibration quantity; and calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated pose information of the vehicle. The scheme provided by the application can calibrate the vehicle pose and improve the positioning accuracy and robustness of the vehicle.

Description

Vehicle pose calibration method and device based on lane lines and electronic equipment
Technical Field
The application relates to the technical field of navigation, in particular to a lane line-based vehicle pose calibration method, a lane line-based vehicle pose calibration device and electronic equipment.
Background
At its core, the autonomous driving of a vehicle is a path-tracking control process. The position and pose of the vehicle are critical to autonomous driving: they are a precondition for the perception and decision-making of the vehicle's sensing and control units, and the accuracy of the vehicle's position within its lane while driving, i.e. its lateral positioning performance, directly affects driving safety.
In the related art, autonomous driving typically relies on a combination of inertial navigation, satellite navigation and odometry. Because of factors such as satellite availability, inertial-navigation drift and accumulated odometer error, the vehicle pose obtained by such positioning deviates from the actual pose of the vehicle, and in scenes with unstable GPS signals, such as tunnels and urban high-rise areas, it is difficult to meet the positioning requirements of autonomous driving.
Disclosure of Invention
In order to solve or partially solve the problems in the related art, the application provides a lane line-based vehicle pose calibration method, which can calibrate the vehicle pose and improve the positioning accuracy and the robustness of the vehicle.
The first aspect of the application provides a vehicle pose calibration method based on lane lines, comprising the following steps:
acquiring first lane line information in an external environment image of a current position of a vehicle and acquiring second lane line information of the current position of the vehicle in a corresponding high-precision map;
acquiring a first sampling point corresponding to the first lane line information in a vehicle coordinate system and acquiring a second sampling point corresponding to the second lane line information in the vehicle coordinate system;
matching the first sampling points with the second sampling points of the same identification type to obtain a corresponding pose calibration quantity;
and calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated vehicle pose information.
In an embodiment, the acquiring the first lane line information in the external environment image of the current position of the vehicle includes:
collecting an external environment image of the current position of the vehicle;
and identifying first lane line information in a first preset range in the external environment image and corresponding identification types through semantic segmentation.
In an embodiment, the obtaining the second lane line information of the current position of the vehicle in the corresponding high-precision map includes:
and acquiring second lane line information in a second preset range in a high-precision map according to the current longitude and latitude and the current pose information of the vehicle.
In an embodiment, the first preset range includes 20 meters to 30 meters from the current position of the vehicle along the driving direction; the second preset range comprises 20-30 meters away from the current longitude and latitude along the driving direction in the high-precision map.
In an embodiment, the acquiring the first sampling point corresponding to the first lane line information in the vehicle coordinate system includes:
performing point cloud representation on the first lane line information to generate a first point cloud;
converting the coordinates of the first point cloud in the image coordinate system into coordinates in the vehicle coordinate system according to camera parameters;
fitting to generate a first line type according to the coordinates of the first point cloud in the vehicle coordinate system;
and extracting a plurality of first sampling points from the first line type.
In an embodiment, the acquiring the second sampling point corresponding to the second lane line information in the vehicle coordinate system includes:
performing point cloud representation on the second lane line information to generate a second point cloud;
converting the coordinates of the second point cloud in the geodetic coordinate system into coordinates in a vehicle coordinate system according to the current pose information;
fitting to generate a second line type according to the coordinates of the second point cloud in the vehicle coordinate system;
and extracting a plurality of second sampling points from the second line type.
In an embodiment, the matching the first sampling point with the corresponding second sampling point to obtain the pose calibration quantity includes:
and obtaining the pose calibration quantity according to the Euclidean distance error of each first sampling point and each second sampling point, the vertical distance error between the first sampling point and the second line type and the parallelism error of the first line type and the second line type respectively.
A second aspect of the present application provides a lane line-based vehicle pose calibration device, comprising:
the identification information acquisition module is used for acquiring first lane line information in an external environment image of the current position of the vehicle and acquiring second lane line information of the current position of the vehicle in a corresponding high-precision map;
the sampling point acquisition module is used for acquiring a first sampling point corresponding to the first lane line information in a vehicle coordinate system and acquiring a second sampling point corresponding to the second lane line information in the vehicle coordinate system;
the matching module is used for matching the first sampling points with the second sampling points of the same identification type to obtain a corresponding pose calibration quantity;
and the calibration module is used for calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated pose information of the vehicle.
A third aspect of the present application provides an electronic device, comprising:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method as described above.
A fourth aspect of the present application provides a computer readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform a method as described above.
The technical scheme that this application provided can include following beneficial effect:
according to the pose optimization method based on the lane lines, corresponding first sampling points and second sampling points are obtained according to first lane line information in a current external environment image of a vehicle and second lane line information of a current position of the vehicle in a corresponding high-precision map; the first sampling point and the second sampling point are matched to obtain the pose calibration quantity, so that the current pose information can be calibrated according to the pose calibration quantity. By means of the design, the pose calibration quantity can be obtained by means of different types of lane line information, so that the accurate pose calibration quantity can be obtained, the calibrated vehicle pose information can be obtained rapidly and accurately, the accuracy and the robustness of the positioning information are improved, the auxiliary positioning under the condition that GPS signals are unstable is facilitated, and the popularization of the automatic driving technology is facilitated.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular descriptions of exemplary embodiments of the application as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the application.
Fig. 1 is a flow chart of a lane line-based vehicle pose calibration method according to an embodiment of the present application;
Fig. 2 is a schematic diagram illustrating the matching of a first sampling point and a second sampling point according to an embodiment of the present application;
Fig. 3 is another flow chart of a lane line-based vehicle pose calibration method according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a lane line-based vehicle pose calibration device according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms "first," "second," "third," etc. may be used herein to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first message may also be referred to as a second message, and similarly, a second message may also be referred to as a first message, without departing from the scope of the present application. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the related art, when a vehicle travels among urban high-rise buildings or through a tunnel, the GPS signal is unstable due to environmental factors, so that the GPS positioning or odometer information of the vehicle deviates, which affects the positioning accuracy of the vehicle during autonomous driving.
To address these problems, the embodiments of the application provide a lane line-based vehicle pose calibration method, which can calibrate the vehicle pose and improve the positioning accuracy and robustness of the vehicle.
The following describes the technical scheme of the embodiments of the present application in detail with reference to the accompanying drawings.
Fig. 1 is a flow chart of a lane line-based vehicle pose calibration method according to an embodiment of the present application.
Referring to fig. 1, a lane line-based vehicle pose calibration method according to an embodiment of the present application includes:
step S110, acquiring first lane line information in an external environment image of a current position of the vehicle and acquiring second lane line information of the current position of the vehicle in a corresponding high-precision map.
During running of the vehicle, an external environment can be photographed by a camera mounted to the vehicle body, thereby obtaining an external environment image. When the GPS signal intensity is detected to be lower than the preset intensity threshold value, an external environment image can be shot through a camera. It is understood that the external environment image may be an image in front of the vehicle traveling direction, so that a lane image in front of the current position of the vehicle may be obtained. The first lane line information is various lane lines in the external environment image, such as solid lines and/or dashed lines. It will be appreciated that, depending on the current position of the vehicle, the lane corresponding to the current position of the vehicle may include one or more lane lines, or may not have any lane lines, depending on the actual situation. Therefore, in the same external environment image obtained by photographing, there may be lane line information or there may be no first lane line information. That is, in the same frame of external environment image, if one or more lane lines are included, there is a corresponding number of first lane line information. In other embodiments, if no first lane line information exists in the external environment image, the camera may continue to shoot at a preset period until the first lane line information can be recognized and obtained in the external environment image.
Similarly, the second lane line information of the current position of the vehicle in the corresponding high-precision map can be acquired synchronously while the first lane line information is captured. To ensure accurate matching in the subsequent steps, the second lane line information and the first lane line information have the same position information and the same acquisition range. It can be understood that a high-precision map not only has high-precision coordinates but also an accurate road shape, including the gradient, curvature, heading, elevation and roll data of each lane; in addition, the identification type of the lane lines on each lane, the color of the lane lines, the road isolation belts, and the arrows and characters on road signs are all presented in the high-precision map. Thus, acquiring the second lane line information of the current position of the vehicle in the high-precision map means acquiring, in the high-precision map, the lane lines on the lane corresponding to the first lane line. It is understood that the number of pieces of second lane line information corresponds to the number of lane lines actually present in the high-precision map.
Step S120, a first sampling point corresponding to the first lane line information in the vehicle coordinate system and a second sampling point corresponding to the second lane line information in the vehicle coordinate system are obtained.
The vehicle coordinate system is a Euclidean coordinate system with the vehicle itself as the origin, i.e. a coordinate system established on Euclidean geometry. Specifically, the vehicle coordinate system may take the center of the rear axle of the vehicle as the origin, the direction of the vehicle head as the positive x-axis, the left side of the vehicle body as the positive y-axis, and the vertically upward direction as the positive z-axis (in accordance with the right-hand rule). It should be understood that the first lane line information is derived from the external environment image, so each piece of first lane line information needs to be transformed into the vehicle coordinate system before the first sampling points are acquired. In one embodiment, the first lane line information is represented as a point cloud to generate a first point cloud; the coordinates of the first point cloud in the image coordinate system are converted into coordinates in the vehicle coordinate system according to the camera parameters; a first line type is generated by fitting the coordinates of the first point cloud in the vehicle coordinate system; and a plurality of first sampling points are extracted from the first line type. That is, after each piece of first lane line information is represented as a point cloud, the coordinates of the points of each point cloud in the image coordinate system of the external environment image are obtained, these coordinates are converted into the vehicle coordinate system according to camera parameters such as the camera intrinsic matrix and the camera extrinsic matrix, the converted points are fitted into lines in the vehicle coordinate system so that the first lane line information in each external environment image is fitted into a corresponding first line type, and a plurality of first sampling points are then extracted from the corresponding first line type according to a preset rule.
Further, the coordinates (u, v) corresponding to the first point cloud in the external environment image may be converted into the vehicle coordinate system according to the following formula (1).
s · [u, v, 1]^T = M_in · M_ex · [x_v, y_v, z_v, 1]^T    (1)
where s is a scale factor, (x_v, y_v, z_v) is the point in the vehicle coordinate system, M_ex is the extrinsic matrix of the camera relative to the center of the vehicle, M_ex,i denotes the i-th row of the extrinsic matrix, and M_in is the intrinsic matrix of the camera.
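For illustration only, the sketch below shows one common way to realize in code the conversion described by formula (1): a pixel (u, v) belonging to a lane line is back-projected onto the road plane of the vehicle frame under a flat-ground assumption (z_v = 0). The function name, the pinhole model and the flat-ground assumption are illustrative choices, not details taken from the patent.

```python
import numpy as np

def pixel_to_vehicle(u, v, K, R_cv, t_cv):
    """Back-project an image pixel onto the road plane (z = 0) in the
    vehicle coordinate system, assuming a pinhole camera.

    K    : 3x3 camera intrinsic matrix (assumed known from calibration)
    R_cv : 3x3 rotation, vehicle frame -> camera frame (extrinsics)
    t_cv : (3,) translation, vehicle frame -> camera frame
    """
    # Ray direction of the pixel in camera coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Express the ray origin (camera centre) and direction in the vehicle frame.
    R_vc = R_cv.T
    origin_veh = -R_vc @ t_cv          # camera centre in vehicle coordinates
    dir_veh = R_vc @ ray_cam           # ray direction in vehicle coordinates
    # Intersect the ray with the road plane z_v = 0.
    s = -origin_veh[2] / dir_veh[2]    # scale factor along the ray
    return origin_veh + s * dir_veh    # 3D point, z component ~ 0
```

Applying such a back-projection to every segmented lane-line pixel would yield the first point cloud in the vehicle coordinate system.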
Further, the second lane line information belongs to the high-precision map and has corresponding GPS coordinates, i.e., coordinates in a geodetic coordinate system (e.g., the WGS-84 coordinate system); these coordinates are converted into the vehicle coordinate system, and the second sampling points are then acquired. In one embodiment, the second lane line information is represented as a point cloud to generate a second point cloud; the coordinates of the second point cloud in the geodetic coordinate system are converted into coordinates in the vehicle coordinate system according to the current pose information; a second line type is generated by fitting the coordinates of the second point cloud in the vehicle coordinate system; and a plurality of second sampling points are extracted from the second line type. That is, after the second lane line information is represented as a point cloud, the coordinates of the points of the point cloud in the geodetic coordinate system are obtained, these coordinates are converted into the vehicle coordinate system using techniques in the related art, the converted points are fitted into lines, and a plurality of second sampling points are extracted from each second line type according to a preset rule. Similarly, when there is more than one piece of second lane line information, a corresponding number of second line types are generated by fitting in the vehicle coordinate system, and corresponding second sampling points are obtained on each second line type.
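As a complementary sketch for the map side, the snippet below converts HD-map lane-line points into the vehicle frame, assuming that both the map points and the current pose have already been projected into a common planar frame such as UTM and that a yaw-only rotation is sufficient; these simplifications are assumptions made for illustration, not the patent's procedure.

```python
import numpy as np

def map_points_to_vehicle(points_utm, vehicle_xy, vehicle_yaw):
    """Transform HD-map lane-line points from a planar world frame (e.g. UTM)
    into the vehicle coordinate system (x forward, y left).

    points_utm  : (N, 2) array of lane-line points in the world frame
    vehicle_xy  : (2,) current vehicle position in the same frame
    vehicle_yaw : current heading in radians (0 = world x-axis)
    """
    c, s = np.cos(vehicle_yaw), np.sin(vehicle_yaw)
    # Vehicle-to-world rotation; right-multiplying row vectors by it applies
    # its transpose, i.e. the world-to-vehicle rotation.
    R = np.array([[c, -s],
                  [s,  c]])
    rel = np.asarray(points_utm, dtype=float) - np.asarray(vehicle_xy, dtype=float)
    return rel @ R
```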
In an embodiment, the first sampling points and the second sampling points may each be a plurality of sampling points extracted at preset intervals from the corresponding line type. It is to be understood that whether the first lane line information and the second lane line information are solid lines or dashed lines, the first line type and the second line type obtained by fitting in the vehicle coordinate system are both continuous lines rather than dashed lines. Each first sampling point and each second sampling point has corresponding three-dimensional coordinates in the vehicle coordinate system, i.e., each sampling point is a 3D point in the vehicle coordinate system. To facilitate matching, in an embodiment, corresponding first and second line types carry the same number of sampling points, that is, the number of first sampling points extracted on the first line type is equal to the number of second sampling points extracted on the second line type.
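A minimal sketch of the fitting-and-sampling step described above is given below; the choice of a polynomial fit y = f(x) and of equal spacing along the x-axis is an assumption made for illustration (the patent only specifies fitting a line type and extracting sampling points at preset intervals).

```python
import numpy as np

def fit_and_sample_lane(points_xy, interval=1.0, degree=2):
    """Fit a lane-line point cloud (vehicle frame) with a polynomial
    y = f(x) and resample it at a fixed spacing along x.

    points_xy : (N, 2) lane-line points, x forward, y left
    interval  : spacing between successive sampling points, in metres
    degree    : polynomial degree used for the fit
    """
    pts = np.asarray(points_xy, dtype=float)
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)   # least-squares fit
    xs = np.arange(pts[:, 0].min(), pts[:, 0].max(), interval)
    ys = np.polyval(coeffs, xs)
    # Return 3D sampling points lying on the road plane (z = 0).
    return np.column_stack([xs, ys, np.zeros_like(xs)])
```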
Step S130, matching the first sampling point with the corresponding second sampling point to obtain the pose calibration quantity.
Because the first sampling points and the second sampling points are both located in the vehicle coordinate system, the sampling points in the same coordinate system can be matched according to the corresponding lane line. For example, when the first lane line information includes a solid line and a dashed line and the second lane line information also includes a solid line and a dashed line, the first sampling points belonging to the solid line are matched with the second sampling points belonging to the solid line, and the first sampling points belonging to the dashed line are matched with the second sampling points belonging to the dashed line. Further, in an embodiment, the pose calibration amount is obtained according to the Euclidean distance error between each first sampling point and the corresponding second sampling point, the perpendicular distance error between each first sampling point and the second line type, and the parallelism error between the first line type and the second line type. Specifically, the first sampling points and the second sampling points can be matched by an ICP point cloud registration method; the following three error functions are obtained, and the pose calibration quantity is determined from the three error functions together. The error functions specifically include: 1. a function for calculating the Euclidean distance error between the three-dimensional coordinates of each first sampling point of the first line type and the three-dimensional coordinates of the corresponding second sampling point; 2. a function for calculating the perpendicular distance error between each first sampling point of the first line type and the corresponding second line type; and 3. a function for calculating the parallelism error between the first line type and the second line type. It can be understood that when there are two or more first line types and second line types, all the first sampling points on all the first line types and all the second sampling points on all the second line types are matched according to the above scheme, and finally one pose calibration amount is obtained.
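The following sketch shows how the three error terms could be combined for one pair of matched line types; the equal default weights and the chord approximation of each line type are assumptions made for illustration, not values given in the patent.

```python
import numpy as np

def matching_error(p_pts, q_pts, w_euclid=1.0, w_perp=1.0, w_parallel=1.0):
    """Combine the three error terms for one pair of matched line types.

    p_pts, q_pts : (N, 3) first / second sampling points, matched index-wise
    The weights are illustrative; the patent does not specify how the three
    terms are combined.
    """
    p = np.asarray(p_pts, dtype=float)
    q = np.asarray(q_pts, dtype=float)

    # 1. Point-to-point Euclidean distance error.
    e_euclid = np.mean(np.linalg.norm(p - q, axis=1))

    # 2. Perpendicular distance from each first sampling point to the second
    #    line type, approximated here by its chord (first-to-last sample).
    d = q[-1] - q[0]
    d = d / np.linalg.norm(d)
    rel = p - q[0]
    e_perp = np.mean(np.linalg.norm(rel - np.outer(rel @ d, d), axis=1))

    # 3. Parallelism error: angle between the chord directions of the two lines.
    d_p = p[-1] - p[0]
    d_p = d_p / np.linalg.norm(d_p)
    cos_angle = np.clip(abs(d_p @ d), -1.0, 1.0)
    e_parallel = np.arccos(cos_angle)

    return w_euclid * e_euclid + w_perp * e_perp + w_parallel * e_parallel
```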
For ease of understanding, as shown in fig. 2, a plurality of spaced first sampling points P are extracted from the first line type AB, and a plurality of spaced second sampling points Q are extracted from the second line type CD. The straight-line distance D1 between each first sampling point P and the corresponding second sampling point Q is the Euclidean distance error; the perpendicular distance D2 from each first sampling point P to the second line type CD is the perpendicular distance error; and the included angle β between the first line type AB and the second line type CD is the parallelism error.
Step S140, calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated pose information of the vehicle.
The current pose information of the vehicle can be obtained according to the odometer information, and the current pose information comprises current position coordinates (x, y, z) and rotation angles of the vehicle in a UTM coordinate system. And calibrating the current pose information according to the pose calibration quantity to obtain the calibrated vehicle pose information. It can be understood that according to the calibrated vehicle pose information, auxiliary positioning under the condition of unstable GPS signals or inaccurate odometer information can be realized, so that positioning of scenes such as automatic driving, unmanned driving and the like is facilitated.
According to the lane line-based pose calibration method above, corresponding first sampling points and second sampling points are obtained from the first lane line information in the current external environment image of the vehicle and the second lane line information of the current position of the vehicle in the corresponding high-precision map; the first sampling points are matched with the second sampling points to obtain the pose calibration quantity, so that the current pose information can be calibrated according to the pose calibration quantity. With this design, an accurate pose calibration quantity can be obtained from the lane line information, and the calibrated vehicle pose information can be obtained quickly and accurately. This improves the accuracy and robustness of the positioning information, facilitates auxiliary positioning when GPS signals are unstable, and facilitates the popularization of autonomous driving technology.
Fig. 3 is another flow chart of a lane line-based vehicle pose calibration method according to an embodiment of the present application.
Referring to fig. 3, a lane line-based vehicle pose calibration method according to an embodiment of the present application includes:
step S210, collecting an external environment image of the current position of the vehicle; and identifying first lane line information in a first preset range in the external environment image through semantic segmentation.
When the GPS signal intensity is detected to be lower than a preset intensity threshold, an external environment image of the area ahead of the current position of the vehicle in the driving direction can be acquired in real time by a camera mounted on the vehicle body, so that the external environment image contains the lane lines of the lane in which the vehicle is located. It will be appreciated that, when no obstruction is present directly in front of the vehicle, the captured external environment image may contain scenery several tens of meters away or more. In order to improve the accuracy of the recognition result, the first preset range may be 20 meters to 30 meters away from the vehicle in the driving direction, so that only the lane lines within 20 meters to 30 meters of the vehicle, that is, the first lane line information, are recognized. It will be appreciated that, in other embodiments, a camera mounted on the vehicle body may also be used to capture, in real time, an external environment image of the current position of the vehicle in a direction deviating from the driving direction.
Further, each lane line on the lane in the external environment image may be obtained by a semantic segmentation method in the related art. The first lane line information includes the line type and position corresponding to each lane line, where the line type may be a double solid line, a single solid line, a double dashed line, a single dashed line and/or a combined solid-and-dashed line, and the position may be on the left or right side of the vehicle. According to the specific content contained in the first lane line information, the first sampling points on a first line type are matched one by one, in the subsequent steps, with the second sampling points obtained by converting the second lane line information having the same line type and the same position.
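A small sketch of the grouping implied above is shown below; the dictionary-style lane representation with "type" and "side" keys is a hypothetical data model chosen for illustration, not the patent's data structure.

```python
from collections import defaultdict

def pair_lane_lines(image_lanes, map_lanes):
    """Pair detected lane lines with HD-map lane lines that share the same
    line type and the same side of the vehicle.

    Each lane entry is assumed to be a dict such as
    {"type": "solid", "side": "left", "points": ...}.
    """
    buckets = defaultdict(list)
    for lane in map_lanes:
        buckets[(lane["type"], lane["side"])].append(lane)

    pairs = []
    for lane in image_lanes:
        candidates = buckets.get((lane["type"], lane["side"]), [])
        if candidates:
            # With at most one map lane per (type, side) bucket this is a
            # one-to-one match; otherwise a nearest-lane choice would be needed.
            pairs.append((lane, candidates[0]))
    return pairs
```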
Step S220, second lane line information in a second preset range in the high-precision map is obtained according to the current longitude and latitude and the current pose information of the vehicle.
When the GPS signal intensity is detected to be lower than the preset intensity threshold value, second lane line information in a second preset range corresponding to the high-precision map can be obtained in real time. In order to reduce the data processing load of the system, the running direction of the vehicle in the high-precision map can be determined according to the current longitude and latitude and the current pose information of the vehicle. Further, each lane line on the lane in the range of 20 meters to 30 meters right ahead in the traveling direction with the current longitude and latitude as a starting point, which is the second preset range, may be acquired as second lane line information. The second lane line information also includes a corresponding line type and position. It can be understood that when the first preset range in the above step is a range deviating from the driving direction, the second preset range is also a range deviating from the driving direction, so as to ensure that the first sampling point and the second sampling point in the same range are obtained later, and ensure the accuracy of the matching object. Further, the second lane line information is stored in the high-precision map in advance, and the line type and the position corresponding to each second lane line information can be obtained while the second lane line information is obtained.
In order to ensure that the subsequent steps obtain a first sampling point and a second sampling point within the same geographical area, the second preset range is the same as the first preset range. The preset ranges of the first lane line information and the second lane line information may be set to the same preset distance, for example, each of the first lane line information and the second lane line information within 30 meters from the current position of the vehicle.
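For illustration, the sketch below selects HD-map lane-line points whose longitudinal distance ahead of the vehicle falls inside the second preset range; interpreting the range as the band from 20 m to 30 m ahead, and the planar map frame, are assumptions (if the range is instead meant as "anywhere within 20-30 m", near would be set to 0).

```python
import numpy as np

def lane_points_in_range(map_points_utm, vehicle_xy, vehicle_yaw,
                         near=20.0, far=30.0):
    """Keep HD-map lane-line points whose distance ahead of the vehicle,
    measured along the heading, lies inside the second preset range.

    map_points_utm : (N, 2) lane-line points in a planar map frame
    vehicle_xy     : (2,) current position in the same frame
    vehicle_yaw    : current heading in radians
    """
    heading = np.array([np.cos(vehicle_yaw), np.sin(vehicle_yaw)])
    rel = np.asarray(map_points_utm, dtype=float) - np.asarray(vehicle_xy, dtype=float)
    forward = rel @ heading                 # signed distance along the heading
    mask = (forward >= near) & (forward <= far)
    return np.asarray(map_points_utm)[mask]
```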
It is understood that the above steps S210 and S220 may be performed in either order or simultaneously.
Step S230, a first sampling point corresponding to the first lane line information in the vehicle coordinate system and a second sampling point corresponding to the second lane line information in the vehicle coordinate system are obtained.
The description of this step can refer to the above step S120, and will not be repeated here.
Step S240, matching the first sampling points with the corresponding second sampling points through point cloud registration to obtain the pose calibration quantity.
After the first sampling point and the second sampling point corresponding to each line type are obtained, each first sampling point on each first line type and each second sampling point corresponding to each second line type are matched in real time, namely one-to-one accurate matching is carried out, and thus pose calibration quantity is obtained.
The pose calibration amounts q and p may be calculated according to the ICP (Iterative Closest Point) pose calculation formula given in the following formula (2), which is solved by iterating over the closest points.
[q, p] = argmin_(q,p) (1/N) · Σ_(i=1..N) || P_i - ( R(q) · Q_i + p ) ||²    (2)
where q is a rotation parameter represented by a quaternion, p is a translation parameter, R(q) is the conversion of the quaternion into a rotation matrix, P_i denotes a first sampling point corresponding to the first lane line information in the external environment image in the vehicle coordinate system, and Q_i denotes a second sampling point corresponding to the second lane line information in the high-precision map in the vehicle coordinate system.
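A compact sketch of solving an objective of the form of formula (2) is shown below: each iteration re-establishes closest-point correspondences and solves the least-squares rotation in closed form via SVD (the Kabsch method). Returning a rotation matrix rather than a quaternion, and the brute-force nearest-neighbour search, are simplifications for illustration rather than the patent's exact procedure.

```python
import numpy as np

def icp_align(p_pts, q_pts, iterations=20):
    """Estimate the rigid transform (R, t) that best maps the second sampling
    points q onto the first sampling points p, in the spirit of formula (2)."""
    p = np.asarray(p_pts, dtype=float)
    q = np.asarray(q_pts, dtype=float)
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        q_tf = q @ R.T + t
        # Closest-point correspondences: for each p_i, the nearest transformed q_j.
        dists = np.linalg.norm(p[:, None, :] - q_tf[None, :, :], axis=2)
        matched = q[np.argmin(dists, axis=1)]
        # Closed-form least-squares rotation between the matched point sets.
        p_c, m_c = p.mean(axis=0), matched.mean(axis=0)
        H = (matched - m_c).T @ (p - p_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T      # reflection-corrected rotation
        t = p_c - R @ m_c
    return R, t
```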
Step S250, calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated pose information of the vehicle.
It will be appreciated that the current pose information of the vehicle in the odometer is a biased pose, i.e., it deviates from the true pose. After the pose calibration quantity is obtained from the first sampling points and the second sampling points, it is fused with the current pose information of the vehicle so that the current pose information is calibrated, providing auxiliary positioning when the GPS signal is weak or the odometer information is inaccurate.
Further, the calibrated vehicle pose information may be obtained from the pose calibration amount of formula (2) above and the current pose information, according to the following formulas (3) and (4).
q_g' = r2q( R(q_c) · R(q_g) )    (3)
p_g' = R(q_c) · p_g + p_c    (4)
where r2q(·) denotes the conversion of a rotation matrix into a quaternion, q_c and p_c denote the [q p] calculated in formula (2), (q_g, p_g) denotes the pose of the current odometer, and (q_g', p_g') denotes the calibrated vehicle pose information, i.e. the accurate pose. The subscript c is an abbreviation of current, and g is an abbreviation of global (odometer).
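Finally, a minimal sketch of applying the calibration to the current odometer pose, following the structure of formulas (3) and (4); keeping rotations as matrices instead of quaternions is a simplification, and the composition order reflects the reconstruction above rather than a detail confirmed by the original figures.

```python
import numpy as np

def apply_calibration(R_c, p_c, R_g, p_g):
    """Compose the pose calibration (R_c, p_c) obtained from formula (2)
    with the current odometer pose (R_g, p_g). Rotations are 3x3 matrices."""
    R_cal = R_c @ R_g          # calibrated orientation, cf. formula (3)
    p_cal = R_c @ p_g + p_c    # calibrated position, cf. formula (4)
    return R_cal, p_cal

# Example with an identity calibration: the odometer pose is returned unchanged.
R_out, p_out = apply_calibration(np.eye(3), np.zeros(3),
                                 np.eye(3), np.array([10.0, 2.0, 0.0]))
```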
As can be seen from the above example, according to the lane line-based vehicle pose calibration method, by acquiring the first lane line information and the second lane line information within the same preset range, all the first sampling points and all the second sampling points within that range can be matched one to one in real time and the pose calibration amount calculated, so that the current pose information of the vehicle can be calibrated according to the pose calibration amount. This improves the robustness of the calibration and the accuracy of vehicle positioning.
Corresponding to the embodiment of the application function implementation method, the application further provides a vehicle pose calibration device based on the lane line, electronic equipment and corresponding embodiments.
Fig. 4 is a schematic structural diagram of a lane line-based vehicle pose calibration device according to an embodiment of the present application.
Referring to fig. 4, the lane line-based vehicle pose calibration device shown in the embodiments of the present application includes an identification information obtaining module 310, a sampling point obtaining module 320, a matching module 330, and a calibration module 340, where:
the identification information obtaining module 310 is configured to obtain first lane line information in an external environment image of a current position of a vehicle and obtain second lane line information of the current position of the vehicle in a corresponding high-precision map.
The sampling point obtaining module 320 is configured to obtain a first sampling point corresponding to the first lane line information in the vehicle coordinate system and obtain a second sampling point corresponding to the second lane line information in the vehicle coordinate system.
The matching module 330 is configured to match the first sampling point with a corresponding second sampling point, and obtain a pose calibration amount.
The calibration module 340 is configured to calibrate current pose information of the vehicle according to the pose calibration amount, and obtain the calibrated pose information of the vehicle.
Further, the identification information acquisition module 310 is configured to collect an external environment image of the current position of the vehicle, and to identify, through semantic segmentation, first lane line information within a first preset range in the external environment image and the corresponding identification types. The identification information acquisition module 310 is also configured to acquire second lane line information within a second preset range in the high-precision map according to the current longitude and latitude and the current pose information of the vehicle. The sampling point acquisition module 320 is configured to perform point cloud representation on the first lane line information to generate a first point cloud; convert the coordinates of the first point cloud in the image coordinate system into coordinates in the vehicle coordinate system according to the camera parameters; fit the coordinates of the first point cloud in the vehicle coordinate system to generate a first line type; and extract a plurality of first sampling points from the first line type. The sampling point acquisition module 320 is further configured to perform point cloud representation on the second lane line information to generate a second point cloud; convert the coordinates of the second point cloud in the geodetic coordinate system into coordinates in the vehicle coordinate system according to the current pose information; fit the coordinates of the second point cloud in the vehicle coordinate system to generate a second line type; and extract a plurality of second sampling points from the second line type. The matching module 330 is configured to obtain the pose calibration amount according to the Euclidean distance error between each first sampling point and the corresponding second sampling point, the perpendicular distance error between each first sampling point and the second line type, and the parallelism error between the first line type and the second line type.
With the lane line-based vehicle pose calibration device above, the pose calibration quantity can be derived from different types of lane line information, so that an accurate pose calibration quantity can be obtained and the calibrated vehicle pose information can be obtained quickly and accurately. This improves the accuracy and robustness of the positioning information, facilitates auxiliary positioning when GPS signals are unstable, and facilitates the popularization of autonomous driving technology.
The specific manner in which the respective modules perform the operations in the apparatus of the above embodiments has been described in detail in the embodiments related to the method, and will not be described in detail herein.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Referring to fig. 5, the electronic device 1000 includes a memory 1010 and a processor 1020.
The processor 1020 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Memory 1010 may include various types of storage units, such as system memory, read-only memory (ROM), and persistent storage. The ROM may store static data or instructions required by the processor 1020 or other modules of the computer. The persistent storage may be a readable and writable storage device, i.e., a non-volatile memory device that does not lose stored instructions and data even after the computer is powered down. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the persistent storage. In other embodiments, the persistent storage may be a removable storage device (e.g., a diskette or an optical drive). The system memory may be a read-write memory device or a volatile read-write memory device, such as dynamic random access memory. The system memory may store instructions and data required by some or all of the processors at runtime. Furthermore, memory 1010 may comprise any combination of computer-readable storage media, including various types of semiconductor memory chips (e.g., DRAM, SRAM, SDRAM, flash memory, programmable read-only memory), magnetic disks, and/or optical disks. In some implementations, memory 1010 may include readable and/or writable removable storage devices such as compact discs (CDs), digital versatile discs (e.g., DVD-ROMs, dual-layer DVD-ROMs), read-only Blu-ray discs, super-density discs, flash memory cards (e.g., SD cards, mini SD cards, micro-SD cards, etc.), magnetic floppy disks, and the like. The computer-readable storage medium does not contain a carrier wave or a transitory electronic signal transmitted wirelessly or by wire.
The memory 1010 has stored thereon executable code that, when processed by the processor 1020, can cause the processor 1020 to perform some or all of the methods described above.
Furthermore, the method according to the present application may also be implemented as a computer program or computer program product comprising computer program code instructions for performing part or all of the steps of the above-described method of the present application.
Alternatively, the present application may also be embodied as a computer-readable storage medium (or non-transitory machine-readable storage medium or machine-readable storage medium) having stored thereon executable code (or a computer program or computer instruction code) which, when executed by a processor of an electronic device (or a server, etc.), causes the processor to perform part or all of the steps of the above-described methods according to the present application.
The embodiments of the present application have been described above. The foregoing description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (7)

1. A vehicle pose calibration method based on lane lines, characterized by comprising the following steps:
acquiring first lane line information in an external environment image of a current position of a vehicle and second lane line information of the current position of the vehicle in a corresponding high-precision map, wherein the method comprises the steps of acquiring the external environment image of the current position of the vehicle when detecting that the GPS signal intensity is lower than a preset intensity threshold;
identifying first lane line information in a first preset range in the external environment image through semantic segmentation;
acquiring a first sampling point corresponding to the first lane line information in a vehicle coordinate system and a second sampling point corresponding to the second lane line information in the vehicle coordinate system, wherein the method comprises the steps of carrying out point cloud representation on the first lane line information to generate a first point cloud;
converting the coordinates of the first point cloud in the image coordinate system into coordinates in the vehicle coordinate system according to camera parameters;
fitting to generate a first line according to the coordinates of the first point cloud in the vehicle coordinate system;
extracting a plurality of first sampling points from the first line type;
performing point cloud representation on the second lane line information to generate a second point cloud;
converting the coordinates of the second point cloud in the geodetic coordinate system into coordinates in the vehicle coordinate system according to the current pose information;
fitting to generate a second line type according to the coordinates of the second point cloud in the vehicle coordinate system;
extracting a plurality of second sampling points in the second line type;
matching the first sampling point with the corresponding second sampling point to obtain a pose calibration quantity;
and calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated vehicle pose information.
2. The method of claim 1, wherein the obtaining second lane-line information for the current location of the vehicle in the corresponding high-precision map comprises:
and acquiring second lane line information in a second preset range in a high-precision map according to the current longitude and latitude and the current pose information of the vehicle.
3. The method of claim 2, wherein:
the first preset range comprises 20-30 meters away from the current position of the vehicle along the driving direction;
the second preset range comprises 20-30 meters away from the current longitude and latitude along the driving direction in the high-precision map.
4. A method according to any one of claims 1 to 3, wherein said matching the first sampling point with the corresponding second sampling point to obtain a pose calibration quantity comprises:
and obtaining the pose calibration quantity according to the Euclidean distance error of each first sampling point and each second sampling point, the vertical distance error between the first sampling point and the second line type and the parallelism error of the first line type and the second line type respectively.
5. A vehicle pose calibration device based on lane lines, characterized by comprising:
the system comprises an identification information acquisition module, a GPS signal acquisition module and a GPS signal acquisition module, wherein the identification information acquisition module is used for acquiring first lane line information in an external environment image of the current position of a vehicle and acquiring second lane line information of the current position of the vehicle in a corresponding high-precision map, and the identification information acquisition module is used for acquiring the external environment image of the current position of the vehicle when detecting that the GPS signal intensity is lower than a preset intensity threshold value; identifying first lane line information in a first preset range in the external environment image through semantic segmentation;
the sampling point acquisition module is used for acquiring a first sampling point corresponding to the first lane line information in a vehicle coordinate system and a second sampling point corresponding to the second lane line information in the vehicle coordinate system, wherein the sampling point acquisition module is configured for performing point cloud representation on the first lane line information to generate a first point cloud; converting the coordinates of the first point cloud in the image coordinate system into coordinates in the vehicle coordinate system according to camera parameters; fitting to generate a first line type according to the coordinates of the first point cloud in the vehicle coordinate system; extracting a plurality of first sampling points from the first line type; performing point cloud representation on the second lane line information to generate a second point cloud; converting the coordinates of the second point cloud in the geodetic coordinate system into coordinates in the vehicle coordinate system according to the current pose information; fitting to generate a second line type according to the coordinates of the second point cloud in the vehicle coordinate system; and extracting a plurality of second sampling points from the second line type;
the matching module is used for matching the first sampling points with the second sampling points of the same identification type to obtain a corresponding pose calibration quantity;
and the calibration module is used for calibrating the current pose information of the vehicle according to the pose calibration quantity to obtain the calibrated pose information of the vehicle.
6. An electronic device, comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any of claims 1-4.
7. A computer readable storage medium having stored thereon executable code which when executed by a processor of an electronic device causes the processor to perform the method of any of claims 1-4.
CN202111375141.8A 2021-11-19 2021-11-19 Vehicle pose calibration method and device based on lane lines and electronic equipment Active CN114034307B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111375141.8A CN114034307B (en) 2021-11-19 2021-11-19 Vehicle pose calibration method and device based on lane lines and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111375141.8A CN114034307B (en) 2021-11-19 2021-11-19 Vehicle pose calibration method and device based on lane lines and electronic equipment

Publications (2)

Publication Number Publication Date
CN114034307A (en) 2022-02-11
CN114034307B (en) 2024-04-16

Family

ID=80138297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111375141.8A Active CN114034307B (en) 2021-11-19 2021-11-19 Vehicle pose calibration method and device based on lane lines and electronic equipment

Country Status (1)

Country Link
CN (1) CN114034307B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114396957B (en) * 2022-02-28 2023-10-13 重庆长安汽车股份有限公司 Positioning pose calibration method based on vision and map lane line matching and automobile
CN114252082B (en) * 2022-03-01 2022-05-17 苏州挚途科技有限公司 Vehicle positioning method and device and electronic equipment
CN114608591B (en) * 2022-03-23 2023-01-10 小米汽车科技有限公司 Vehicle positioning method and device, storage medium, electronic equipment, vehicle and chip
CN115235500B (en) * 2022-09-15 2023-04-14 北京智行者科技股份有限公司 Lane line constraint-based pose correction method and device and all-condition static environment modeling method and device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019060814A (en) * 2017-09-28 2019-04-18 株式会社Subaru Self-driving own vehicle location detection device
CN110567480A (en) * 2019-09-12 2019-12-13 北京百度网讯科技有限公司 Optimization method, device and equipment for vehicle positioning and storage medium
CN111220164A (en) * 2020-01-21 2020-06-02 北京百度网讯科技有限公司 Positioning method, device, equipment and storage medium
CN111242031A (en) * 2020-01-13 2020-06-05 禾多科技(北京)有限公司 Lane line detection method based on high-precision map
CN111750878A (en) * 2019-03-28 2020-10-09 北京初速度科技有限公司 Vehicle pose correction method and device
CN111750881A (en) * 2019-03-29 2020-10-09 北京初速度科技有限公司 Vehicle pose correction method and device based on light pole
CN111998860A (en) * 2020-08-21 2020-11-27 北京百度网讯科技有限公司 Automatic driving positioning data verification method and device, electronic equipment and storage medium
CN112284416A (en) * 2020-10-19 2021-01-29 武汉中海庭数据技术有限公司 Automatic driving positioning information calibration device, method and storage medium
CN112284400A (en) * 2020-12-24 2021-01-29 腾讯科技(深圳)有限公司 Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN113607185A (en) * 2021-10-08 2021-11-05 禾多科技(北京)有限公司 Lane line information display method, lane line information display device, electronic device, and computer-readable medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3893150A1 (en) * 2020-04-09 2021-10-13 Tusimple, Inc. Camera pose estimation techniques

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019060814A (en) * 2017-09-28 2019-04-18 株式会社Subaru Self-driving own vehicle location detection device
CN111750878A (en) * 2019-03-28 2020-10-09 北京初速度科技有限公司 Vehicle pose correction method and device
CN111750881A (en) * 2019-03-29 2020-10-09 北京初速度科技有限公司 Vehicle pose correction method and device based on light pole
CN110567480A (en) * 2019-09-12 2019-12-13 北京百度网讯科技有限公司 Optimization method, device and equipment for vehicle positioning and storage medium
CN111242031A (en) * 2020-01-13 2020-06-05 禾多科技(北京)有限公司 Lane line detection method based on high-precision map
CN111220164A (en) * 2020-01-21 2020-06-02 北京百度网讯科技有限公司 Positioning method, device, equipment and storage medium
CN111998860A (en) * 2020-08-21 2020-11-27 北京百度网讯科技有限公司 Automatic driving positioning data verification method and device, electronic equipment and storage medium
CN112284416A (en) * 2020-10-19 2021-01-29 武汉中海庭数据技术有限公司 Automatic driving positioning information calibration device, method and storage medium
CN112284400A (en) * 2020-12-24 2021-01-29 腾讯科技(深圳)有限公司 Vehicle positioning method and device, electronic equipment and computer readable storage medium
CN113607185A (en) * 2021-10-08 2021-11-05 禾多科技(北京)有限公司 Lane line information display method, lane line information display device, electronic device, and computer-readable medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Localization of a road sweeper based on vision, wheel speed and a single-axis gyroscope; 陆逸适 et al.; Journal of Tongji University (Natural Science); 2019-12-31 (Issue S1); pp. 205-212 *

Also Published As

Publication number Publication date
CN114034307A (en) 2022-02-11

Similar Documents

Publication Publication Date Title
CN114034307B (en) Vehicle pose calibration method and device based on lane lines and electronic equipment
CN107703528B (en) Visual positioning method and system combined with low-precision GPS in automatic driving
CN114088114B (en) Vehicle pose calibration method and device and electronic equipment
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN111065043B (en) System and method for fusion positioning of vehicles in tunnel based on vehicle-road communication
JP4596566B2 (en) Self-vehicle information recognition device and self-vehicle information recognition method
WO2020133415A1 (en) Systems and methods for constructing a high-definition map based on landmarks
CN110728720A (en) Method, device, equipment and storage medium for camera calibration
CN116997771A (en) Vehicle, positioning method, device, equipment and computer readable storage medium thereof
CN114705121A (en) Vehicle pose measuring method and device, electronic equipment and storage medium
CN112967393B (en) Correction method and device for vehicle movement track, electronic equipment and storage medium
CN114241062A (en) Camera external parameter determination method and device for automatic driving and computer readable storage medium
CN112595335B (en) Intelligent traffic driving stop line generation method and related device
EP2927635B1 (en) Feature set optimization in vision-based positioning
CN113306559A (en) Compensation for vertical road camber in road shape estimation
WO2020113425A1 (en) Systems and methods for constructing high-definition map
US20220404170A1 (en) Apparatus, method, and computer program for updating map
CN113139031B (en) Method and related device for generating traffic sign for automatic driving
CN115112125A (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN115468576A (en) Automatic driving positioning method and system based on multi-mode data fusion
WO2022133986A1 (en) Accuracy estimation method and system
JP2020008462A (en) Own vehicle location estimation device
CN112348903A (en) Method and device for calibrating external parameters of automobile data recorder and electronic equipment
KR101667484B1 (en) Method and Device for Estimating position of Vehicle Using Digital Map
CN114089317A (en) Multi-device calibration method and device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant