CN114248782B - Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium - Google Patents

Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium Download PDF

Info

Publication number
CN114248782B
CN114248782B (application CN202111611135.8A)
Authority
CN
China
Prior art keywords
unmanned vehicle
road surface
value
pitch angle
point
Prior art date
Legal status
Active
Application number
CN202111611135.8A
Other languages
Chinese (zh)
Other versions
CN114248782A (en)
Inventor
单国航
贾双成
朱磊
李成军
Current Assignee
Zhidao Network Technology Beijing Co Ltd
Original Assignee
Zhidao Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhidao Network Technology Beijing Co Ltd filed Critical Zhidao Network Technology Beijing Co Ltd
Priority to CN202111611135.8A
Publication of CN114248782A
Application granted
Publication of CN114248782B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/11 Pitch movement
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/15 Road slope, i.e. the inclination of a road segment in the longitudinal direction
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to a method and a device for determining the pitch angle of an unmanned vehicle, and a computer readable storage medium. The method comprises: positioning a first positioning point and a second positioning point of the road surface on which the unmanned vehicle is currently driving, to obtain the gradient value of that road surface; acquiring images of at least two parallel lane lines of the current driving road surface; optimizing the included angle between the unmanned vehicle and the current driving road surface according to those images, to obtain the optimal value of that angle; subtracting the sum of the optimal value and the gradient value from the pitch angle detection value of the unmanned vehicle at the moment when the included angle takes the optimal value, to obtain the deviation value between the pitch angle detection value and the pitch angle true value of the unmanned vehicle; and obtaining the pitch angle true value of the unmanned vehicle from the pitch angle detection value at any moment and the deviation value.

Description

Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium
Technical Field
The application relates to the technical field of automatic driving, in particular to a method and a device for determining a pitch angle of an unmanned vehicle and a computer readable storage medium.
Background
In the field of autonomous driving, vehicle attitude angles such as roll, yaw and pitch are raw data required for autonomous driving decisions and high-precision map computation. Among these attitude angles, the pitch angle of the vehicle is defined as the angle between the y-axis of the coordinate system of the positioning sensor mounted on the vehicle and the horizontal plane. Because of installation errors (for example, the positioning sensor is not parallel to the chassis of the vehicle), the real pitch angle of the vehicle differs from the pitch angle detected and output by the positioning sensor: a deviation value exists between the two, and the real pitch angle is obtained by subtracting this deviation value from the pitch angle output by the positioning sensor. In the related art, the deviation value is obtained either by driving the vehicle to a specific place, such as a flat road surface at a known inclination to the horizontal plane, and measuring it there, or by using additional sensors such as vision equipment and/or a lidar, calibrating the external parameters between those sensors and the positioning sensor, and finally deriving the deviation value. However, besides requiring a specific place, the related art needs many sensors and calibration devices and is therefore very cumbersome.
Disclosure of Invention
In order to solve or partially solve the problems in the related art, the application provides a method, a device and a computer readable storage medium for determining the pitch angle of an unmanned vehicle, so that the real pitch angle of the vehicle can be conveniently determined at any place.
The first aspect of the application provides a method for determining a pitch angle of an unmanned vehicle, which comprises the following steps:
respectively positioning a first positioning point and a second positioning point of a current running road surface of the unmanned vehicle to obtain a gradient value of the current running road surface of the unmanned vehicle, wherein the current running road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently running;
acquiring images of at least two parallel lane lines of a current driving road surface of the unmanned vehicle;
Optimizing an included angle between the unmanned vehicle and the current driving road surface according to images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle to obtain an optimal value of the included angle between the unmanned vehicle and the current driving road surface;
subtracting the sum of the optimal value and the gradient value from a pitch angle detection value of the unmanned vehicle when the included angle between the unmanned vehicle and the current driving road surface takes the optimal value to obtain a deviation value between the pitch angle detection value of the unmanned vehicle and a pitch angle true value of the unmanned vehicle;
And obtaining a pitch angle true value of the unmanned vehicle according to the pitch angle detection value of the unmanned vehicle at any time and the deviation value.
A second aspect of the present application provides an autonomous vehicle pitch angle determination apparatus comprising:
The first acquisition module is used for respectively positioning a first positioning point and a second positioning point of a current running road surface of the unmanned vehicle to acquire a gradient value of the current running road surface of the unmanned vehicle, wherein the current running road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently running;
the second acquisition module is used for acquiring images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle;
the optimization module is used for optimizing the included angle between the unmanned vehicle and the current driving road surface according to the images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle to obtain an optimal value of the included angle between the unmanned vehicle and the current driving road surface;
The calculation module is used for subtracting the sum of the optimal value and the gradient value from the pitch angle detection value of the unmanned vehicle when the included angle between the unmanned vehicle and the current driving road surface takes the optimal value to obtain a deviation value between the pitch angle detection value of the unmanned vehicle and the pitch angle true value of the unmanned vehicle;
And the third acquisition module is used for obtaining the true pitch angle value of the unmanned vehicle according to the pitch angle detection value of the unmanned vehicle at any moment and the deviation value.
A third aspect of the present application provides an electronic apparatus, comprising:
A processor; and
A memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method as described above.
A fourth aspect of the application provides a computer readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform a method as described above.
The technical scheme provided by the application can have the following beneficial effects. On the one hand, because the current driving road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently driving, the technical scheme of the embodiments of the application does not require a specific place when implemented; it can be carried out anywhere, or at least most places meet its conditions. On the other hand, whether acquiring the images of at least two parallel lane lines of the current driving road surface, positioning the first and second positioning points of that road surface, or obtaining its gradient value from them, each step requires relatively little computation; compared with the related art, in which a lidar must collect a large amount of point cloud data and the algorithm is complex and computationally expensive, optimizing the included angle between the unmanned vehicle and the current driving road surface and then obtaining the deviation value between the pitch angle detection value and the pitch angle true value of the unmanned vehicle requires much less computation.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular descriptions of exemplary embodiments of the application as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the application.
Fig. 1 is a flow chart of a method for determining a pitch angle of an unmanned vehicle according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an image of the road surface on which the unmanned vehicle is currently traveling, captured by a camera, according to an embodiment of the present application;
FIG. 3 is a schematic view of the parallel lane lines extracted from FIG. 2, illustrating an embodiment of the present application;
FIG. 4 is a schematic view of an unmanned vehicle in a slope road with an angle α with the road surface according to an embodiment of the present application;
fig. 5 is a schematic structural view of an autonomous vehicle pitch angle determining apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While embodiments of the present application are illustrated in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms "first," "second," "third," etc. may be used herein to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the application. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the field of automatic driving, due to installation errors (for example, the positioning sensor is not parallel to the chassis of the vehicle), the real pitch angle of the vehicle differs from the pitch angle detected and output by the positioning sensor: a deviation value exists between the two, and the real pitch angle is obtained by subtracting this deviation value from the pitch angle output by the positioning sensor. In the related art, either the vehicle is driven to a specific place, for example a flat road surface at a known inclination to the horizontal plane, and the deviation value is obtained by measurement; or additional sensors such as vision equipment and/or a lidar are used, the external parameter T_LR between those sensors and the positioning sensor is calibrated, the lidar measures the inclination angle β of the current flat ground/wall surface in the lidar coordinate system, the positioning sensor outputs a pitch angle detection value p, and T_LR, β and p are then used to finally obtain the deviation value (bias). However, the former method has specific requirements on the location, and the latter requires many sensors and calibration devices in addition to a specific location, which is very cumbersome.
Aiming at the problems, the embodiment of the application provides the unmanned vehicle pitch angle determining method, which can conveniently determine the real pitch angle of the vehicle at any place.
The following describes the technical scheme of the embodiment of the present application in detail with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a method for determining a pitch angle of an unmanned vehicle according to an embodiment of the present application mainly includes steps S101 to S105, and is described as follows:
Step S101: and respectively positioning a first positioning point and a second positioning point of the current running road surface of the unmanned vehicle to obtain the gradient value of the current running road surface of the unmanned vehicle, wherein the current running road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently running.
In the embodiment of the application, the current driving road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently driving; it places no restriction on implementing the technical scheme of the application. The first positioning point of the current driving road surface may be the current position of the unmanned vehicle on that road surface, and the second positioning point may be any position other than the first positioning point. As an embodiment of the application, positioning the first positioning point and the second positioning point of the current driving road surface respectively and obtaining the gradient value of that road surface may be done as follows: detect, through the positioning system, the altitude of the first positioning point, the altitude of the second positioning point, and the coordinates of the two points; calculate the distance between the first and second positioning points from their coordinates; and, based on Euclidean geometry, calculate the gradient value of the current driving road surface from the altitude of the first positioning point, the altitude of the second positioning point, and the distance between them. In this embodiment, the signal output by the positioning system may be a Real-Time Kinematic (RTK) signal obtained from GPS and an inertial measurement unit. Denote the coordinates of the first positioning point by (x_1, y_1) and those of the second positioning point by (x_2, y_2), the altitude of the first positioning point by Altitude_1, the altitude of the second positioning point by Altitude_2, and the gradient value of the current driving road surface by slope. From Euclidean geometry,

tan(slope) = (Altitude_2 - Altitude_1) / D((x_1, y_1), (x_2, y_2))

where D((x_1, y_1), (x_2, y_2)) denotes the distance between the first and second positioning points and tan(slope) is the tangent of slope. After obtaining tan(slope), taking the inverse tangent gives

slope = arctan((Altitude_2 - Altitude_1) / D((x_1, y_1), (x_2, y_2)))
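A minimal sketch of this gradient computation, assuming the positioning points are given as planar coordinates in metres; the function and parameter names are illustrative, not from the patent:

```python
import math

def road_slope(point1_xy, point2_xy, altitude1, altitude2):
    """Gradient of the current driving road surface from two RTK fixes.

    point1_xy, point2_xy: planar (x, y) coordinates of the first and second
    positioning points; altitude1, altitude2: their altitudes. Returns the
    slope angle in radians, negative for a downhill road (Altitude_2 < Altitude_1).
    """
    # Euclidean distance D((x1, y1), (x2, y2)) between the two positioning points.
    run = math.hypot(point2_xy[0] - point1_xy[0], point2_xy[1] - point1_xy[1])
    # tan(slope) = rise / run, then invert the tangent.
    return math.atan2(altitude2 - altitude1, run)

# Example: the second point is 10 km further along the road and 50 m higher.
# print(math.degrees(road_slope((0.0, 0.0), (10000.0, 0.0), 100.0, 150.0)))  # ~0.29 deg
```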
Step S102: and acquiring images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle.
Here, an image of the road surface on which the unmanned vehicle is currently driving, containing at least two parallel lane lines, can be captured by a camera mounted on the unmanned vehicle. Fig. 2 shows such an image of the current driving road surface captured by the onboard camera. The at least two parallel lane lines can be identified or extracted with a neural network model; fig. 3 shows, as an example, 4 parallel lane lines extracted from the road surface image of fig. 2.
Step S103: and optimizing the included angle between the unmanned vehicle and the current driving road surface according to the images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle to obtain the optimal value of the included angle between the unmanned vehicle and the current driving road surface.
The relationship among the road surface inclination (i.e., the gradient value), the pitch angle detection value of the vehicle, the angle between the vehicle and the road surface, and the deviation value is described with reference to fig. 4. As shown in fig. 4, the gradient value of the road surface, denoted slope, is the angle between the road surface and the horizontal plane; slope is negative on a downhill road and positive on an uphill road. The pitch angle detection value of the vehicle is denoted pitch_detect; the angle between the vehicle and the road surface is denoted α; and the deviation value between the pitch angle detection value and the pitch angle true value of the vehicle, i.e., the zero bias, is denoted bias. The value of bias is determined by the installation error of the positioning sensor of the vehicle and is unrelated to the position of the vehicle or the road conditions. The relationship between pitch_detect, slope, bias and α is:

pitch_detect - bias - slope = α    (Equation 1)

Equivalently rearranging Equation 1 gives Equation 2:

pitch_detect - bias = α + slope    (Equation 2)

As described in the background above, pitch_detect - bias in Equation 2 is the pitch angle true value of the vehicle (denoted pitch_real), i.e.

pitch_real = pitch_detect - bias = α + slope    (Equation 3)
Looking at Equation 3, the values of both slope and pitch_detect are readily detected by the positioning sensor mounted on the unmanned vehicle, such as an RTK positioning device that integrates a GPS module and an Inertial Measurement Unit (IMU); the question is how to obtain bias or α. Since α is the angle between the vehicle and the road surface, it may be different at every moment while the vehicle is driving, so detecting α at every moment is unrealistic, or at least the cost of detecting it in real time is relatively high. bias, on the other hand, is a constant determined by the installation error of the positioning sensor and unrelated to the position of the vehicle or the road conditions; once its value at some specific moment is calculated, that value can be used wherever the vehicle is located on whatever road surface. According to Equation 1, given that slope and pitch_detect can be conveniently detected, bias at a specific moment can be calculated as long as α at that moment is known, and that specific moment can be the moment at which the angle between the vehicle and the current driving road surface takes its optimal value. Therefore, in the embodiment of the application, the angle between the unmanned vehicle and the current driving road surface is optimized according to the images of at least two parallel lane lines of the current driving road surface, and the optimal value of that angle is thereby obtained.
In the application, optimizing the angle between the unmanned vehicle and the current driving road surface according to the images of at least two parallel lane lines of the current driving road surface, to obtain the optimal value of that angle, may be done as follows: according to the images of the at least two parallel lane lines, calculate the coordinates of any point on the lane lines in the road coordinate system, these coordinates including the x-axis component p_x and the z-axis component p_z of the point in the road coordinate system; then calculate the optimal value of the angle between the unmanned vehicle and the current driving road surface from p_x and p_z. In this embodiment, take an arbitrary point p on the at least two parallel lane lines and denote its coordinates in the road coordinate system by (p_x, p_y, p_z). Since p lies on the road surface, its y-axis component is known: p_y = h, where h is the mounting height of the camera relative to the road surface. As an embodiment of the application, the coordinates of any point on the lane lines in the road coordinate system may be obtained as follows: obtain the coordinates (u, v) of the point in the pixel coordinate system from the images of the at least two parallel lane lines; then calculate its coordinates in the road coordinate system according to the formula

d · [u, v, 1]^T = K · CamRRoad · [p_x, p_y, p_z]^T

where K is the internal reference (intrinsic) matrix of the camera, d is the pixel depth, and CamRRoad is the conversion matrix between the camera coordinate system and the road coordinate system; K and d are both known quantities.
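A back-projection sketch of this formula, assuming CamRRoad is a pure rotation and the road coordinate frame is centred at the camera so that the ground plane lies at p_y = h, and taking the pixel depth d as given, as in the text; the function and argument names are illustrative:

```python
import numpy as np

def pixel_to_road(u, v, d, K, cam_R_road):
    """Back-project one lane-line pixel into the road coordinate system.

    Solves d * [u, v, 1]^T = K @ cam_R_road @ [p_x, p_y, p_z]^T for the road-frame
    point, where K is the camera intrinsic matrix, d the pixel depth and
    cam_R_road the CamRRoad matrix of the text (assumed here to be a pure rotation).
    """
    p_cam = d * np.linalg.inv(K) @ np.array([u, v, 1.0])  # point in the camera frame
    p_road = np.linalg.inv(cam_R_road) @ p_cam            # rotate into the road frame
    return p_road                                          # (p_x, p_y, p_z); p_y should be close to h
```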
After the images of the at least two parallel lane lines of the current driving road surface have been determined, the coordinates (u, v) of any point p on the lane lines in the pixel coordinate system are also determined, and p_y = h in the formula above is a known quantity. In the formula, CamRRoad = CamRCar × CarRRoad, where CamRCar is the transformation matrix between the camera coordinate system and the vehicle coordinate system (the vehicle coordinate system is the coordinate system in which the unmanned vehicle is located) and CarRRoad is the transformation matrix between the vehicle coordinate system and the road coordinate system. CarRRoad is a dependent variable of the angle between the unmanned vehicle and the current driving road surface, that is, CarRRoad = CarRRoad(α).
The transformation matrix CamRCar between the camera coordinate system and the vehicle coordinate system can be calibrated in advance. In particular, when the vehicle is stationary and perfectly parallel to the road surface, CarRRoad = E (the identity matrix), so according to CamRRoad = CamRCar × CarRRoad we have CamRRoad = CamRCar; the calibration result CamRRoad obtained by placing the calibration tool on the road surface is therefore exactly CamRCar. It follows from the above analysis that when CarRRoad is substituted into the formula above, the computed road-frame coordinates of any point on the at least two parallel lane lines are in fact related to α, the angle between the vehicle and the road surface, i.e., they are dependent variables of α. This fact is the basis for optimizing the angle α between the unmanned vehicle and the current driving road surface. In other words, in the initial state where the angle between the vehicle and the current driving road surface is set to α = 0, i.e., using the CamRCar calibrated when the vehicle was stationary and perfectly parallel to the road surface together with CarRRoad = E, the spatial lane-line coordinates computed by the formula above for the lane lines in fig. 3 are not parallel to each other. The angle α between the unmanned vehicle and the current driving road surface is then taken as the optimization target, and optimizing it yields the optimal value of α, at which the lane lines of fig. 3 computed by the formula above become parallel to each other.
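A sketch of how CarRRoad can be parameterized by the angle α; the patent does not state the rotation axis or sign convention, so the pure pitch rotation about the lateral axis below is an assumption:

```python
import numpy as np

def car_R_road(alpha):
    """CarRRoad as a function of the vehicle/road angle alpha (radians).

    Assumed here to be a pure pitch rotation about the lateral (x) axis, with an
    illustrative sign convention. At alpha = 0 this is the identity matrix E,
    matching the calibration state in which the vehicle is parallel to the road."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def cam_R_road(alpha, cam_R_car):
    """CamRRoad = CamRCar @ CarRRoad(alpha); cam_R_car is the rotation calibrated
    while the vehicle stands parallel to the road (when it equals CamRRoad)."""
    return cam_R_car @ car_R_road(alpha)
```

Feeding cam_R_road(alpha, cam_R_car) into the pixel_to_road sketch above makes the recovered lane-line coordinates, and hence their parallelism, a function of α, which is what the optimization below exploits.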
After the coordinates of the points on the at least two parallel lane lines in the road coordinate system have been calculated, as an embodiment of the present application, calculating the optimal value of the angle between the unmanned vehicle and the current driving road surface from the x-axis component p_x and the z-axis component p_z of each point may be done as follows: substitute p_x and p_z of each point into the straight-line equations corresponding to the at least two parallel lane lines, as the unknowns x and z of those equations; use the least squares method to compute the sum of the error values of the points in the straight-line equations, and take the angle between the unmanned vehicle and the current driving road surface at which this sum of error values is minimal as the optimal value of that angle.
In the ideal case, since the i-th and (i+1)-th lane lines are parallel, when the x-axis and z-axis components of an arbitrary point p on the i-th lane line are substituted into the straight-line equation of the i-th lane line in the road coordinate system, and the x-axis and z-axis components of an arbitrary point on the (i+1)-th lane line are substituted into the straight-line equation of the (i+1)-th lane line, both results are 0 and so is their sum. However, as described above, because the angle α between the unmanned vehicle and the current driving road surface is not necessarily 0, the i-th and (i+1)-th lane lines as seen by the camera are no longer parallel, the results are no longer 0, and an error is produced. In this case, the following loss function may be established:

E_0 = Σ_{i=1..m} Σ_{j=1..n} (x_ij + b·z_ij + c_i)^2
In the loss function, m denotes the number of lane lines and n denotes the number of points on each lane line. Let the straight-line equation corresponding to the i-th lane line be x + b·z + c_i = 0, and let the coordinates of the j-th point p_ij on the i-th lane line in the road coordinate system be (x_ij, y_ij, z_ij), where y_ij is a known quantity and x_ij and z_ij can be obtained by the method of the embodiment above. For the point p_ij, x_ij and z_ij are substituted, as the unknowns x and z, into the straight-line equations x + b·z + c_1 = 0, x + b·z + c_2 = 0, ..., x + b·z + c_i = 0, ..., x + b·z + c_m = 0 corresponding to the at least two parallel lane lines. Since the coordinates of any point on the lane lines in the road coordinate system are dependent variables of α, the loss function E_0 is in fact a function of the angle α between the unmanned vehicle and the current driving road surface; α can therefore be taken as the optimization target, and the sum E_0 of the error values of the points p_ij in the m straight-line equations is obtained by the least squares method, namely

E_0(α) = Σ_{i=1..m} Σ_{j=1..n} (x_ij + b·z_ij + c_i)^2

The angle between the unmanned vehicle and the current driving road surface at which this sum of error values is minimal is taken as the optimal value of the angle α. It should be noted that "the sum of error values E_0 is minimal" does not mean that E_0 is zero: a threshold may be preset, and when E_0 is smaller than the preset threshold it may be considered minimal and the algorithm may stop iterating.
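A sketch of the loss and its minimization over α, assuming the lane-line pixels and their depths are already available and reusing the hypothetical pixel_to_road and cam_R_road helpers sketched earlier; the shared-slope line fit is one plausible reading of E_0:

```python
import numpy as np

def lane_line_loss(alpha, lane_pixels_with_depth, K, cam_R_car):
    """E_0(alpha): sum of squared residuals of the lane-line points against the
    straight-line model x + b*z + c_i = 0 (one shared slope b, one intercept c_i
    per lane line), with the points back-projected for the candidate angle alpha.

    lane_pixels_with_depth: one list of (u, v, d) tuples per lane line."""
    R = cam_R_road(alpha, cam_R_car)
    rows, rhs = [], []
    n_lines = len(lane_pixels_with_depth)
    for i, pixels in enumerate(lane_pixels_with_depth):
        for (u, v, d) in pixels:
            p_x, _, p_z = pixel_to_road(u, v, d, K, R)
            row = np.zeros(1 + n_lines)
            row[0] = p_z        # coefficient of the shared slope b
            row[1 + i] = 1.0    # coefficient of this line's intercept c_i
            rows.append(row)
            rhs.append(-p_x)    # x_ij + b*z_ij + c_i = 0  =>  b*z_ij + c_i = -x_ij
    A, y = np.asarray(rows), np.asarray(rhs)
    params, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit of b and the c_i
    residuals = A @ params - y
    return float(residuals @ residuals)              # E_0(alpha)

# The optimal alpha minimizes E_0, e.g. with a bounded scalar search:
# from scipy.optimize import minimize_scalar
# alpha_opt = minimize_scalar(lane_line_loss, bounds=(-0.1, 0.1), method="bounded",
#                             args=(lane_pixels_with_depth, K, cam_R_car)).x
```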
Step S104: and subtracting the sum of the optimal value and the gradient value from the pitch angle detection value of the unmanned vehicle when the included angle between the unmanned vehicle and the current driving road surface takes the optimal value, so as to obtain the deviation value between the pitch angle detection value of the unmanned vehicle and the pitch angle true value of the unmanned vehicle.
Denote the optimal value of the angle α between the unmanned vehicle and the current road surface as α_optimal. After steps S101 to S103, the pitch angle detection value pitch_optimal-detect and the gradient value slope_optimal at the moment when the angle α takes the optimal value α_optimal have been obtained, so according to Equation 1 above, bias = pitch_optimal-detect - (slope_optimal + α_optimal). Although this bias is obtained at the moment when the angle α between the vehicle and the current road surface is the optimal value α_optimal, as described above, the value of bias is determined by the installation error of the positioning sensor of the vehicle, regardless of the position of the vehicle, the road conditions, or whether the angle α is the optimal value α_optimal. Therefore, bias = pitch_optimal-detect - (slope_optimal + α_optimal) obtained from Equation 1 is also the deviation value between the pitch angle detection value of the vehicle and the pitch angle true value of the vehicle.
Step S105: and obtaining the true pitch angle value of the unmanned vehicle according to the pitch angle detection value and the deviation value of the unmanned vehicle at any moment.
Specifically, obtaining the pitch angle true value of the unmanned vehicle from the pitch angle detection value at any moment and the deviation value may be done as follows: detect the pitch angle detection value of the unmanned vehicle at any moment with the positioning sensor mounted on the unmanned vehicle, and subtract the deviation value from it to obtain the pitch angle true value. Denoting the pitch angle detection value at any moment by pitch_detect and the pitch angle true value by pitch_real, we have pitch_real = pitch_detect - bias = pitch_detect - (pitch_optimal-detect - slope_optimal - α_optimal). It should also be noted that, as shown in fig. 4, since the unmanned vehicle and the camera mounted on it are fixed relative to each other, once the pitch angle true value of the unmanned vehicle has been obtained through steps S101 to S105, the pitch angle true value of the camera can be obtained from the fixed relationship between the unmanned vehicle and the camera.
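A minimal sketch of steps S104 and S105; the numeric values in the usage example are made up for illustration, not taken from the patent:

```python
def pitch_bias(pitch_detect_at_optimal, slope_optimal, alpha_optimal):
    """Step S104: bias = pitch_optimal-detect - (slope_optimal + alpha_optimal), per Equation 1."""
    return pitch_detect_at_optimal - (slope_optimal + alpha_optimal)

def true_pitch(pitch_detect, bias):
    """Step S105: pitch_real = pitch_detect - bias, per Equation 3."""
    return pitch_detect - bias

# Illustrative numbers only (degrees): on a 2.0 deg uphill grade with a residual
# 0.5 deg vehicle/road angle, a sensor reading of 3.5 deg implies a 1.0 deg
# mounting bias; a later reading of 4.2 deg is then corrected to 3.2 deg.
bias = pitch_bias(3.5, 2.0, 0.5)   # 1.0
print(true_pitch(4.2, bias))       # 3.2
```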
As can be seen from the method for determining the pitch angle of the unmanned vehicle illustrated in fig. 1, on the one hand, because the current driving road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently driving, the technical scheme of the embodiments of the application does not require a specific place when implemented; it can be carried out anywhere, or at least most places meet its conditions. On the other hand, whether acquiring the images of at least two parallel lane lines of the current driving road surface, positioning the first and second positioning points of that road surface, or obtaining its gradient value from them, each step requires relatively little computation; compared with the related art, in which a lidar must collect a large amount of point cloud data and the algorithm is complex and computationally expensive, optimizing the angle between the unmanned vehicle and the current driving road surface and then obtaining the deviation value between the pitch angle detection value and the pitch angle true value of the unmanned vehicle in this way requires much less computation.
Corresponding to the embodiment of the application function implementation method, the application also provides an automatic driving vehicle pitch angle determining device, electronic equipment and corresponding embodiments.
Referring to fig. 5, a schematic structural view of an autonomous vehicle pitch angle determining apparatus according to an embodiment of the present application is shown. For convenience of explanation, only portions relevant to the embodiments of the present application are shown. The apparatus for determining the pitch angle of an automatically driven vehicle illustrated in fig. 5 may be applied to intelligent driving, and mainly includes a first obtaining module 501, a second obtaining module 502, an optimizing module 503, a calculating module 504, and a third obtaining module 505, where:
The first obtaining module 501 is configured to respectively locate a first locating point and a second locating point of a current running road surface of the unmanned vehicle, and obtain a gradient value of the current running road surface of the unmanned vehicle, where the current running road surface of the unmanned vehicle is any road surface on which the unmanned vehicle currently runs;
the second obtaining module 502 is configured to obtain images of at least two parallel lane lines on a current driving road surface of the unmanned vehicle;
The optimizing module 503 is configured to optimize an included angle between the unmanned vehicle and the current driving road surface according to images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle, so as to obtain an optimal value of the included angle between the unmanned vehicle and the current driving road surface;
the calculating module 504 is configured to subtract the sum of the optimal value and the gradient value from the pitch angle detection value of the unmanned vehicle when the angle between the unmanned vehicle and the current driving road surface takes the optimal value, so as to obtain a deviation value between the pitch angle detection value of the unmanned vehicle and the pitch angle true value of the unmanned vehicle;
And the third obtaining module 505 is configured to obtain a pitch angle true value of the unmanned vehicle according to the pitch angle detection value and the deviation value of the unmanned vehicle at any time.
The specific manner in which the respective modules perform the operations in the apparatus of the above embodiments has been described in detail in the embodiments related to the method, and will not be described in detail herein.
Optionally, the optimization module 503 illustrated in fig. 5 may include a first calculation unit and a second calculation unit, where:
The first calculation unit is used for calculating the coordinates of any point on at least two parallel lane lines of the current running road surface of the unmanned vehicle under a road coordinate system according to the images of the at least two parallel lane lines of the current running road surface of the unmanned vehicle, wherein the coordinates of the any point under the road coordinate system comprise an x-axis component p_x and a z-axis component p_z of the any point under the road coordinate system;
And the second calculation unit is used for calculating the optimal value of the included angle between the unmanned vehicle and the current driving road surface according to the x-axis component p_x and the z-axis component p_z of the any point under the road coordinate system.
Alternatively, the first computing unit of the above example may include a first acquiring unit and a third computing unit, wherein:
The first acquisition unit is used for acquiring coordinates of any point on at least two parallel lane lines of the current driving road surface of the unmanned vehicle under a pixel coordinate system according to images of the at least two parallel lane lines of the current driving road surface of the unmanned vehicle;
A third calculation unit, configured to calculate, according to the formula d · [u, v, 1]^T = K · CamRRoad · [p_x, p_y, p_z]^T, the coordinates of any point on the at least two parallel lane lines of the current driving road surface of the unmanned vehicle under the road coordinate system, wherein u and v are the coordinates of the any point under the pixel coordinate system, K is the internal reference matrix of the camera mounted on the unmanned vehicle, d is the pixel depth of the any point, p_x, p_y and p_z are the x, y and z axis components of the coordinates of the any point under the road coordinate system, p_y = h, where h is the mounting height of the camera relative to the road surface, and CamRRoad is the conversion matrix between the camera coordinate system and the road coordinate system, the camera coordinate system being the coordinate system of the camera mounted on the unmanned vehicle.
Optionally, camRRoad = CamRCar × CarRRoad, camRCar in the above example is a transformation matrix between a camera coordinate system and a vehicle coordinate system, carRRoad is a transformation matrix between the vehicle coordinate system and a road coordinate system, carRRoad is a dependent variable of an angle between the unmanned vehicle and a current driving road surface, and the vehicle coordinate system is a coordinate system where the unmanned vehicle is located.
Optionally, the second computing unit of the above example may include a fourth computing unit and a tuning unit, wherein:
The fourth calculation unit is used for substituting the x-axis component p_x and the z-axis component p_z of any point in the road coordinate system into the straight-line equations corresponding to the at least two parallel lane lines, as the unknown quantity x and the unknown quantity z of those straight-line equations respectively;
And the optimizing unit is used for obtaining the sum of error values of any point in the linear equation by adopting a least square method, and taking the included angle between the unmanned vehicle and the current running road surface as the optimal value of the included angle between the unmanned vehicle and the current running road surface when the sum of the error values is minimum.
Optionally, the first obtaining module 501 illustrated in fig. 5 may include a first detecting unit, a fifth calculating unit, and a sixth calculating unit, where:
The first detection unit is used for detecting and obtaining the height of a first locating point and the height of a second locating point of the current running road surface of the unmanned vehicle and the coordinates of the first locating point and the coordinates of the second locating point through the locating system;
A fifth calculation unit, configured to calculate a distance between the first positioning point and the second positioning point according to the coordinates of the first positioning point and the coordinates of the second positioning point;
and the sixth calculation unit is used for calculating, based on Euclidean geometry, the gradient value of the current running road surface of the unmanned vehicle according to the height of the first locating point, the height of the second locating point and the distance between the first locating point and the second locating point.
Optionally, the third acquisition module 505 illustrated in fig. 5 may include a second detection unit and a seventh calculation unit, where:
the second detection unit is used for detecting and obtaining a pitch angle detection value of the unmanned vehicle at any moment through a positioning sensor carried by the unmanned vehicle;
And the seventh calculation unit is used for subtracting the deviation value from the pitch angle detection value of the unmanned vehicle at any moment to obtain the true pitch angle value of the unmanned vehicle.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Referring to fig. 6, an electronic device 600 includes a memory 610 and a processor 620.
The Processor 620 may be a Central Processing Unit (CPU), or another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Memory 610 may include various types of storage units, such as system memory, Read Only Memory (ROM), and persistent storage. The ROM may store static data or instructions required by the processor 620 or other modules of the computer. The persistent storage may be a readable and writable storage device, i.e., a non-volatile memory device that does not lose stored instructions and data even after the computer is powered down. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the persistent storage. In other embodiments, the persistent storage may be a removable storage device (e.g., a diskette or optical drive). The system memory may be a read-write memory device or a volatile read-write memory device, such as dynamic random access memory. The system memory may store instructions and data required by some or all of the processors at runtime. Furthermore, memory 610 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (e.g., DRAM, SRAM, SDRAM, flash memory, programmable read-only memory), magnetic disks, and/or optical disks. In some implementations, memory 610 may include readable and/or writable removable storage devices such as Compact Discs (CDs), digital versatile discs (e.g., DVD-ROM, dual-layer DVD-ROM), read-only Blu-ray discs, super-density discs, flash memory cards (e.g., SD cards, mini SD cards, micro-SD cards, etc.), magnetic floppy disks, and the like. The computer readable storage medium does not contain a carrier wave or an instantaneous electronic signal transmitted by wireless or wired transmission.
The memory 610 has stored thereon executable code that, when processed by the processor 620, can cause the processor 620 to perform some or all of the methods described above.
Furthermore, the method according to the application may also be implemented as a computer program or computer program product comprising computer program code instructions for performing part or all of the steps of the above-described method of the application.
Or the application may also be embodied as a computer-readable storage medium (or non-transitory machine-readable storage medium or machine-readable storage medium) having stored thereon executable code (or a computer program or computer instruction code) which, when executed by a processor of an electronic device (or server, etc.), causes the processor to perform some or all of the steps of the above-described method according to the application.
The foregoing description of embodiments of the application has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (9)

1. A method of determining a pitch angle of an unmanned vehicle, the method comprising:
Respectively positioning a first positioning point and a second positioning point of a current running road surface of the unmanned vehicle to obtain a gradient value of the current running road surface of the unmanned vehicle, wherein the current running road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently running; the unmanned vehicle is positioned at any position on any road surface;
acquiring images of at least two parallel lane lines of a current driving road surface of the unmanned vehicle;
Calculating coordinates of any point on at least two parallel lane lines of a current driving road surface of the unmanned vehicle under a road coordinate system according to images of at least two parallel lane lines, wherein the coordinates of the any point on the at least two parallel lane lines under the road coordinate system comprise an x-axis component p_x and a z-axis component p_z of the any point under the road coordinate system, and calculating an optimal value of an included angle between the unmanned vehicle and the current driving road surface according to the x-axis component p_x and the z-axis component p_z of the any point under the road coordinate system;
subtracting the sum of the optimal value and the gradient value from a pitch angle detection value of the unmanned vehicle when the included angle between the unmanned vehicle and the current driving road surface takes the optimal value to obtain a deviation value between the pitch angle detection value of the unmanned vehicle and a pitch angle true value of the unmanned vehicle;
And obtaining a pitch angle true value of the unmanned vehicle according to the pitch angle detection value of the unmanned vehicle at any time and the deviation value.
2. The unmanned vehicle pitch angle determining method according to claim 1, wherein the calculating coordinates of any point on at least two parallel lanes on the road coordinate system according to the image of the at least two parallel lanes on the current driving road surface of the unmanned vehicle comprises:
Acquiring coordinates of any point on at least two parallel lane lines under a pixel coordinate system according to images of at least two parallel lane lines of a current driving road surface of the unmanned vehicle;
According to the formula d · [u, v, 1]^T = K · CamRRoad · [p_x, p_y, p_z]^T, calculating coordinates of the any point on the at least two parallel lane lines under a road coordinate system, wherein u and v are coordinates of the any point under a pixel coordinate system, K is an internal reference matrix of the camera mounted on the unmanned vehicle, d is a pixel depth of the any point, p_x, p_y and p_z are the x, y and z axis components of the coordinates of the any point under the road coordinate system, p_y = h, where h is the installation height of the camera relative to the road surface, and CamRRoad is a transformation matrix between a camera coordinate system and the road coordinate system, the camera coordinate system being the coordinate system in which the camera mounted on the unmanned vehicle is located.
3. The unmanned aerial vehicle pitch angle determination method of claim 2, wherein CamRRoad = CamRCar x CarRRoad, wherein CamRCar is a transformation matrix between a camera coordinate system and a vehicle coordinate system, wherein CarRRoad is a transformation matrix between the vehicle coordinate system and a road coordinate system, and wherein CarRRoad is a dependent variable of an angle between the unmanned aerial vehicle and the current driving road surface, and wherein the vehicle coordinate system is the coordinate system in which the unmanned aerial vehicle is located.
4. The method according to claim 1, wherein calculating an optimal value of an angle between the vehicle and the current road surface based on coordinates of the arbitrary point in a road coordinate system including an x-axis component P_x and a z-axis component P_z of the arbitrary point in the road coordinate system includes:
Substituting an x-axis component P_x and a z-axis component P_z of the arbitrary point in a road coordinate system into the at least two parallel lane line corresponding straight line equations as an unknown quantity x and an unknown quantity z of the at least two parallel lane line corresponding straight line equations respectively;
And obtaining the sum of error values of the arbitrary points in the linear equation by adopting a least square method, and taking the included angle between the unmanned vehicle and the current driving road surface as the optimal value of the included angle between the unmanned vehicle and the current driving road surface when the sum of the error values is minimum.
5. The method for determining a pitch angle of an unmanned vehicle according to claim 1, wherein the positioning the first positioning point and the second positioning point of the current driving road surface of the unmanned vehicle respectively, and obtaining the gradient value of the current driving road surface of the unmanned vehicle, comprises:
detecting and obtaining the height of a first locating point and the height of a second locating point of the current running road surface of the unmanned vehicle and the coordinates of the first locating point and the coordinates of the second locating point through a locating system;
calculating the distance between the first locating point and the second locating point according to the coordinates of the first locating point and the coordinates of the second locating point;
Based on Euclidean geometry, calculating the gradient value of the current running road surface of the unmanned vehicle according to the height of the first locating point, the height of the second locating point and the distance between the first locating point and the second locating point.
6. The unmanned vehicle pitch angle determination method according to claim 1, wherein obtaining the true pitch angle value of the unmanned vehicle according to the pitch angle detection value of the unmanned vehicle at any moment and the deviation value comprises:
detecting the pitch angle detection value of the unmanned vehicle at any moment by a positioning sensor carried by the unmanned vehicle;
and subtracting the deviation value from the pitch angle detection value of the unmanned vehicle at that moment to obtain the true pitch angle value of the unmanned vehicle.
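Claim 6's correction step reduces to a single subtraction, assuming the detection value and the deviation value share the same unit and sign convention:

```python
def true_pitch(pitch_detected, deviation):
    # true pitch angle = sensor reading minus the calibrated deviation (claim 6)
    return pitch_detected - deviation
```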
7. An unmanned vehicle pitch angle determination apparatus, the apparatus comprising:
a first acquisition module, used for respectively positioning a first positioning point and a second positioning point of the current driving road surface of the unmanned vehicle to obtain a gradient value of the current driving road surface of the unmanned vehicle, wherein the current driving road surface of the unmanned vehicle is any road surface on which the unmanned vehicle is currently driving, and the unmanned vehicle is located at any position on that road surface;
a second acquisition module, used for acquiring images of at least two parallel lane lines of the current driving road surface of the unmanned vehicle;
an optimization module, used for calculating the coordinates of any point on the at least two parallel lane lines in the road coordinate system according to the images of the at least two parallel lane lines of the current driving road surface of the unmanned vehicle, wherein the coordinates of the arbitrary point in the road coordinate system comprise an x-axis component Px and a z-axis component Pz, and for calculating the optimal value of the angle between the unmanned vehicle and the current driving road surface according to the x-axis component Px and the z-axis component Pz of the arbitrary point in the road coordinate system;
a calculation module, used for subtracting the sum of the optimal value and the gradient value from the pitch angle detection value of the unmanned vehicle when the angle between the unmanned vehicle and the current driving road surface takes the optimal value, so as to obtain a deviation value between the pitch angle detection value of the unmanned vehicle and the true pitch angle value of the unmanned vehicle;
and a third acquisition module, used for obtaining the true pitch angle value of the unmanned vehicle according to the pitch angle detection value of the unmanned vehicle at any moment and the deviation value.
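The calculation module's deviation formula, written out as a sketch with illustrative parameter names: the deviation is the detected pitch minus the sum of the optimal vehicle-road angle and the road gradient.

```python
def pitch_deviation(pitch_detected, optimal_angle, gradient):
    # deviation between the sensor's pitch reading and the true pitch,
    # per the calculation module: detected value minus (optimal angle + gradient)
    return pitch_detected - (optimal_angle + gradient)
```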
8. An electronic device, comprising:
a processor; and
a memory having executable code stored thereon, which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 6.
9. A computer-readable storage medium having executable code stored thereon, which, when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1 to 6.
CN202111611135.8A 2021-12-27 2021-12-27 Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium Active CN114248782B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111611135.8A CN114248782B (en) 2021-12-27 2021-12-27 Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN114248782A CN114248782A (en) 2022-03-29
CN114248782B true CN114248782B (en) 2024-06-07

Family

ID=80798023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111611135.8A Active CN114248782B (en) 2021-12-27 2021-12-27 Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114248782B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003097945A (en) * 2001-09-20 2003-04-03 Mitsubishi Motors Corp Road surface slope estimating device
JP2009276109A (en) * 2008-05-13 2009-11-26 National Traffic Safety & Environment Laboratory Device and method for measuring gradient in road
CN105339226A (en) * 2013-05-02 2016-02-17 通用汽车环球科技运作有限责任公司 Integrated grade and pitch estimation using a three-axis inertial-measuring device
JP2017090159A (en) * 2015-11-06 2017-05-25 株式会社日本自動車部品総合研究所 Vehicle pitch angle estimation device
DE102017217271A1 (en) * 2017-09-28 2019-03-28 Bayerische Motoren Werke Aktiengesellschaft Method for determining the pitch angle of a motor vehicle
CN109900254A * 2019-03-28 2019-06-18 合肥工业大学 Monocular-vision road gradient calculation method and computing device thereof
CN111060071A (en) * 2019-12-16 2020-04-24 中公高科养护科技股份有限公司 Road slope measuring method and system
CN112862890A (en) * 2021-02-07 2021-05-28 黑芝麻智能科技(重庆)有限公司 Road gradient prediction method, road gradient prediction device and storage medium
CN113442932A (en) * 2021-07-28 2021-09-28 广州小鹏汽车科技有限公司 Method, apparatus, vehicle, and computer-readable storage medium for estimating road surface gradient

Similar Documents

Publication Publication Date Title
CN111947671B (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
US10634777B2 (en) Radar odometry for vehicle
CN111476106B (en) Monocular camera-based straight road relative gradient real-time prediction method, system and device
US20180150976A1 (en) Method for automatically establishing extrinsic parameters of a camera of a vehicle
CN108759823B (en) Low-speed automatic driving vehicle positioning and deviation rectifying method on designated road based on image matching
AU2018282302A1 (en) Integrated sensor calibration in natural scenes
CN114034307B (en) Vehicle pose calibration method and device based on lane lines and electronic equipment
CN110969055B (en) Method, apparatus, device and computer readable storage medium for vehicle positioning
JP7113134B2 (en) vehicle controller
CN111060946A (en) Method and apparatus for estimating position
CN114088114B (en) Vehicle pose calibration method and device and electronic equipment
CN114241062A (en) Camera external parameter determination method and device for automatic driving and computer readable storage medium
US11983890B2 (en) Method and apparatus with motion information estimation
CN113945937A (en) Precision detection method, device and storage medium
CN114973198A (en) Course angle prediction method and device of target vehicle, electronic equipment and storage medium
CN114248782B (en) Unmanned vehicle pitch angle determination method, unmanned vehicle pitch angle determination device and computer readable storage medium
CN112016366A (en) Obstacle positioning method and device
CN116052117A (en) Pose-based traffic element matching method, equipment and computer storage medium
CN115308785A (en) Unmanned vehicle autonomous positioning method based on multi-sensor fusion
CN113763483B (en) Method and device for calibrating pitch angle of automobile data recorder
WO2019091381A1 (en) Method and device for identifying stereoscopic object, and vehicle and storage medium
US20240221219A1 (en) Method and device for calibrating a camera mounted on a vehicle
JP2020059332A (en) Position estimation device and position estimation method
CN113538546B (en) Target detection method, device and equipment for automatic driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant