CN109345593B - Camera posture detection method and device - Google Patents

Publication number: CN109345593B (grant of application publication CN109345593A)
Application number: CN201811025833.8A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Active (granted)
Applicant/Assignee: Hisense Co Ltd
Inventors: 高语函, 吴风炎, 曲磊
Prior art keywords: camera, lane, image, candidate, determining


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration


Abstract

The application provides a camera posture detection method and device. The method includes: determining the coordinate position, in an image coordinate system, of the vanishing point of the lane lines in a lane image, the vanishing point being the intersection of the two lane lines; determining a candidate pitch angle and a candidate yaw angle of the camera based on that coordinate position and preset imaging parameters; and calculating confidence rates for the candidate pitch angles and candidate yaw angles determined from a specified number of lane images, taking the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera and the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera. In this scheme, candidate pitch and yaw angles are determined from the coordinate position of the lane-line vanishing point in the lane image and from the imaging parameters, and the actual pitch and yaw angles are then determined over the specified number of lane images, so that the posture of the camera can be determined from these two angles.

Description

Camera posture detection method and device
Technical Field
The present disclosure relates to the field of image processing, and in particular, to a method and an apparatus for detecting a camera pose.
Background
The lane line detection is to acquire a lane image by using a camera installed in front of the vehicle, extract lane pixel feature points in the image, and obtain lane information in front of the vehicle after fitting the lane pixel feature points. And determining whether the current vehicle deviates from the lane by judging the distance between the vehicle and lane lines on two sides of the lane in the lane information.
Under normal conditions, the optical axis of the camera is parallel to the lane lines and to the level ground; the lane images captured by the camera can therefore be used effectively for lane line detection.
However, if the posture of the camera is not adjusted during installation, or if it changes due to an external force (for example, a force generated by vehicle bumping) while the vehicle is driving, an angle may exist between the optical axis of the camera and the lane lines and/or the horizontal ground. An incorrect camera posture affects the lane line detection result, making the judged distance between the vehicle and the lane lines on both sides inaccurate.
Disclosure of Invention
In view of this, the present application provides a camera posture detection method and device, which detect the posture of the camera so that it can subsequently be adjusted, thereby avoiding the adverse effect of an incorrect posture on lane line detection.
Specifically, the method is realized through the following technical scheme:
a method of detecting camera pose, comprising:
determining the coordinate position of a vanishing point of a lane line in the lane image in an image coordinate system; wherein the vanishing point is the intersection point of the two lane lines;
determining candidate pitch angles and candidate yaw angles of the camera based on the coordinate position of the vanishing point and preset camera parameters;
and calculating the confidence rates of the candidate pitch angles and the candidate yaw angles determined by the specified number of lane images, determining the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera, and determining the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera.
In the method for detecting a camera pose, the method further comprises:
and performing attitude adjustment on the camera based on the determined actual pitch angle and the actual yaw angle.
In the method for detecting the camera pose, the determining the coordinate position of the vanishing point of the lane line in the lane image in the image coordinate system includes:
acquiring measurement parameters of a vehicle-mounted sensor, and determining whether the current road condition meets a preset lane line detection rule or not based on the measurement parameters;
if so, determining whether the difference between the lane line parameter in the current lane image and the lane line parameter in the previous lane image is smaller than a preset ratio threshold value;
and if so, determining the coordinate position of the vanishing point of the lane line in the image coordinate system from the current lane image.
In the camera attitude detection method, the camera parameters include a coordinate position of an image center of the lane image in an image coordinate system, a focal length of a camera, and a unit pixel size of the lane image;
the determining candidate pitch angle and candidate yaw angle of the camera based on the coordinate position of the vanishing point and the preset shooting parameter comprises the following steps:
calculating a candidate pitch angle β of the camera based on a first formula:
β = arctan( dy·(v0 − v1) / f )
wherein v1 is the ordinate of the vanishing point, v0 is the ordinate of the image center of the lane image, dy is the unit pixel size of the lane image in the ordinate direction, and f is the focal length of the camera;
calculating a candidate yaw angle α of the camera based on a second formula:
α = arctan( dx·(u1 − u0)·cosβ / f )
wherein u1 is the abscissa of the vanishing point, u0 is the abscissa of the image center of the lane image, dx is the unit pixel size of the lane image in the abscissa direction, f is the focal length of the camera, and β is the candidate pitch angle of the camera.
In the method for detecting a camera pose, the method further comprises:
and determining the actual distance between the vehicle and the lane lines on both sides based on the determined actual pitch angle and the actual yaw angle.
A camera pose detection apparatus comprising:
the first determining unit is used for determining the coordinate position of a vanishing point of the lane line in the lane image in an image coordinate system; wherein the vanishing point is the intersection point of the two lane lines;
the computing unit is used for determining candidate pitch angles and candidate yaw angles of the camera based on the coordinate position of the vanishing point and preset camera shooting parameters;
and the second determining unit is used for calculating the confidence rates of the candidate pitch angles and the candidate yaw angles determined by the specified number of lane images, determining the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera, and determining the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera.
In the detection apparatus of the camera pose, the apparatus further includes:
and the adjusting unit is used for carrying out attitude adjustment on the camera based on the determined actual pitch angle and the actual yaw angle.
In the detection apparatus of the camera pose, the first determination unit is further configured to:
acquiring measurement parameters of a vehicle-mounted sensor, and determining whether the current road condition meets a preset lane line detection rule or not based on the measurement parameters;
if so, determining whether the difference between the lane line parameter in the current lane image and the lane line parameter in the previous lane image is smaller than a preset ratio threshold value;
and if so, determining the coordinate position of the vanishing point of the lane line in the image coordinate system from the current lane image.
In the camera attitude detection device, the imaging parameters include a coordinate position of an image center of the lane image in an image coordinate system, a focal length of a camera, and a unit pixel size of the lane image;
the computing unit is further configured to:
calculating a candidate pitch angle β of the camera based on a first formula:
β = arctan( dy·(v0 − v1) / f )
wherein v1 is the ordinate of the vanishing point, v0 is the ordinate of the image center of the lane image, dy is the unit pixel size of the lane image in the ordinate direction, and f is the focal length of the camera;
calculating a candidate yaw angle α of the camera based on a second formula:
α = arctan( dx·(u1 − u0)·cosβ / f )
wherein u1 is the abscissa of the vanishing point, u0 is the abscissa of the image center of the lane image, dx is the unit pixel size of the lane image in the abscissa direction, f is the focal length of the camera, and β is the candidate pitch angle of the camera.
In the detection apparatus of the camera pose, the apparatus further includes:
and a third determining unit, configured to determine the actual distance between the vehicle and the lane lines on both sides based on the determined actual pitch angle and the actual yaw angle.
In the embodiment of the application, the coordinate position of a vanishing point of a lane line in a lane image in an image coordinate system is determined, then candidate pitch angles and candidate yaw angles of a camera are determined based on the coordinate position of the vanishing point and preset camera shooting parameters, the confidence rates of the candidate pitch angles and the confidence rates of the candidate yaw angles determined by a specified number of lane images are calculated, the candidate pitch angle with the highest confidence rate is determined as the actual pitch angle of the camera, and the candidate yaw angle with the highest confidence rate is determined as the actual yaw angle of the camera;
candidate pitch angles and candidate yaw angles of the camera can be determined through coordinate positions of vanishing points of lane lines in lane images and camera shooting parameters, and then actual pitch angles and actual yaw angles of the camera are determined based on a specified number of lane images, so that the attitude of the camera is determined according to the pitch angles and the yaw angles.
Drawings
FIG. 1 is a schematic diagram of an image coordinate system shown in the present application;
FIG. 2 is a schematic view of a camera coordinate system shown in the present application;
FIG. 3 is a flow chart of a camera pose detection method shown in the present application;
FIG. 4 is a schematic view of one lane line variation shown in the present application;
FIG. 5 is a schematic view of another lane line variation shown in the present application;
FIG. 6 is a block diagram of an embodiment of a camera pose detection apparatus shown in the present application;
fig. 7 is a hardware configuration diagram of a camera attitude detection device according to the present application.
Detailed Description
To make the technical solutions in the embodiments of the present invention better understood, and to make the above objects, features and advantages of the embodiments more comprehensible, the prior art and the technical solutions in the embodiments are described below with reference to the accompanying drawings.
In the process of detecting the camera posture, the embodiment of the application relates to the relevant content of the space geometric relationship of the camera imaging. To more clearly illustrate the principle of the detection means in the present application, first three coordinate systems related to the imaging of the camera are described. Wherein the three coordinate systems include an image coordinate system, a camera coordinate system, and a world coordinate system.
Referring to fig. 1, a schematic diagram of an image coordinate system is shown in the present application. As shown in fig. 1, the image coordinate system establishes a rectangular coordinate system u-v with the upper left corner of the image as an origin O0, where the rectangular coordinate system u-v is in units of pixels, and the abscissa u and the ordinate v of any pixel in the image are the number of columns and rows in the image, respectively.
Since the u-v coordinates only represent the column and row numbers of a pixel and do not express its physical position in the image, an image coordinate system X-Y expressed in physical units (for example, millimeters) is also established in the image plane.
Generally, the intersection of the optical axis of the camera and the image plane, which is generally located at the center of the image plane, may be defined as the origin O1 of the X-Y coordinate system. The X-axis is parallel to the u-axis, and the Y-axis is parallel to the v-axis. Let (u0, v0) represent the coordinates of O1 in a u-v coordinate system, and dx and dy represent the physical dimensions of each pixel in the X-axis direction and the Y-axis direction, respectively. Each pixel in the image may be scaled between its coordinates in the u-v coordinate system and its coordinates in the X-Y coordinate system, as described in more detail below.
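The scaling between the two coordinate systems described above can be sketched as follows; this is a minimal illustration, and the function names are not from the patent:

```python
def pixel_to_physical(u, v, u0, v0, dx, dy):
    """Convert pixel coordinates (u, v) in the u-v system to physical
    image-plane coordinates (X, Y) relative to the principal point O1;
    dx and dy are the physical sizes of one pixel along each axis."""
    return ((u - u0) * dx, (v - v0) * dy)

def physical_to_pixel(x, y, u0, v0, dx, dy):
    """Inverse conversion: physical X-Y coordinates back to pixel u-v."""
    return (u0 + x / dx, v0 + y / dy)
```

For instance, with the principal point at (960, 540) and 0.005 mm pixels, the pixel (1160, 540) lies 1 mm to the right of O1.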
Referring to fig. 2, a schematic diagram of a camera coordinate system is shown in the present application. As shown in fig. 2, the origin O of the camera coordinate system is the optical center of the camera (the center of projection), and the Xc axis and Yc axis of the camera coordinate system are parallel to the X axis and Y axis of the image-plane X-Y coordinate system, respectively. The Zc axis of the camera coordinate system coincides with the optical axis of the camera, is perpendicular to the image plane, and intersects the image plane at point O1. The distance O-O1 between the origin of the camera coordinate system and the origin of the image-plane X-Y coordinate system is the focal length f of the camera.
The world coordinate system is introduced to describe the position of the camera. Since the present application does not involve calculating the absolute position of the camera, the origin of the camera coordinate system is taken as the origin of the world coordinate system, and the camera coordinate system when the camera is in the correct posture is taken as the world coordinate system.
Referring to fig. 3, a flowchart of a camera pose detection method according to the present application is shown, as shown in fig. 3, the method includes the following steps:
step 301: determining the coordinate position of a vanishing point of a lane line in the lane image in an image coordinate system; wherein the vanishing point is an intersection point of the two lane lines.
The method can be applied to an electronic device, which may be the camera itself or a smart device connected to the camera. For ease of description, the camera is taken below as the executing subject.
Wherein the camera is mounted at a suitable position in front of the vehicle so that the camera can capture images in front of the vehicle.
During driving, the camera acquires lane images in front of the vehicle in real time, extracts lane-line pixel feature points from each image using a lane line detection technique, and fits the feature points to obtain lane information in front of the vehicle. As an example, the lane information may be a gray-scale map on which the lane lines on both sides of the lane have been fitted.
Lane line detection can be implemented by methods such as the Hough transform, the least squares method, or RANSAC; specific reference can be made to the prior art, and details are not repeated here.
The lane lines on both sides of the road can be seen as two straight lines in the image; they extend forward from near to far and eventually meet at a point, beyond which they no longer extend. The intersection of the two lane lines is therefore called the vanishing point of the lane lines. When the road is straight and the vehicle faces straight ahead, the camera can be mounted at a suitable position on the vehicle so that the vanishing point in the captured image lies at the center of the image; in other words, the coordinates of the vanishing point in the image-plane u-v coordinate system are (u0, v0).
When the posture of the camera changes, namely an included angle exists between the optical axis of the camera and the lane line, or an included angle exists between the optical axis and the horizontal ground, or included angles exist between the optical axis and the lane line and the horizontal ground respectively, the coordinate position of the vanishing point of the lane line in the image changes.
Referring to fig. 4 and 5, schematic diagrams of two lane line variations shown in the present application are shown.
As shown in fig. 4, when the camera has a pitch angle, if the pitch angle is greater than 0, the lane line moves upward in the lane image, and accordingly, the vanishing point of the lane line also moves upward, and the distance that the lane line and the vanishing point move upward is larger as the pitch angle is larger; if the pitch angle is smaller than 0, the lane line moves downward in the lane image, correspondingly, the vanishing point of the lane line also moves downward, and the smaller the pitch angle is, the larger the distance between the lane line and the vanishing point moving downward is.
As shown in fig. 5, when the camera has a yaw angle, if the yaw angle is greater than 0, the lane line will move to the right in the lane image, and accordingly, the vanishing point of the lane line will also move to the right, and the larger the yaw angle is, the larger the distance between the lane line and the vanishing point to move to the right is; if the yaw angle is smaller than 0, the lane line will move to the left in the lane image, and correspondingly, the vanishing point of the lane line will also move to the left, and the smaller the yaw angle, the larger the distance that the lane line and the vanishing point move to the left.
Based on this principle, the posture angles of the camera (the pitch angle and the yaw angle) are determined from the coordinate position of the vanishing point in the image. Therefore, after the lane information is obtained by the lane line detection technique, the coordinate position of the vanishing point of the lane lines in the lane information (the gray-scale map) can be further determined; this is also the coordinate position of the vanishing point in the lane image.
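When the two fitted lane lines are available in slope-intercept form, the vanishing point is simply their intersection in the image plane; a minimal sketch (the helper name is illustrative, not from the patent):

```python
def vanishing_point(k1, b1, k2, b2):
    """Intersection (u, v) of two lane lines v = k*u + b given in the
    image's u-v pixel coordinate system."""
    if k1 == k2:
        raise ValueError("parallel image lines have no finite vanishing point")
    u = (b2 - b1) / (k1 - k2)
    return (u, k1 * u + b1)
```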
In the illustrated embodiment, it is necessary to determine the coordinate position of the vanishing point of the lane line when a specified condition is satisfied, in consideration of the existence of a series of factors affecting the camera attitude detection.
First, the road conditions under which the vehicle is traveling may affect the camera pose detection. When the vehicle travels on a curved road or a sloping road, the lane line is not a straight line, and therefore the position of the vanishing point of the lane line in the lane image also changes.
Secondly, the vehicle driving state may also have an influence on the camera attitude detection. When the vehicle is driven unstably, for example, the vehicle bumps up and down, the imaging position of the lane line in the lane image is also changed, and the position of the vanishing point of the lane line is changed.
Therefore, in order to enable the coordinate position of the vanishing point of the lane line determined in the lane image to be accurately used for detecting the posture of the camera, the following specified conditions need to be met:
firstly, a vehicle runs on a flat straight road;
secondly, the vehicle running state is stable.
In order to verify these conditions, the camera may first acquire the measurement parameters of a vehicle-mounted sensor (such as a gyroscope) and determine, based on them, whether the current road condition satisfies a preset lane line detection rule. The lane line detection rule limits the measurement parameters to a specified range; when the measurement parameters of the vehicle-mounted sensor fall within this range, the vehicle is traveling on a flat straight road. As an example, if the vehicle-mounted sensor is a three-axis gyroscope, the measurement parameters may be the angular accelerations of the gyroscope about its three axes.
On one hand, if the measurement parameters do not meet the lane line detection rule, the current road condition can be determined to be a curved road or a sloping road, and the coordinate position of the vanishing point of the lane line does not need to be determined from the lane image;
on the other hand, if the measurement parameters satisfy the lane line detection rule, it may be determined that the current road condition is a flat straight road, and it may be further determined whether the vehicle driving condition is stable, as described below.
The camera may determine whether the current vehicle driving condition is stable through the detected change rate of the lane line parameter. The lane line parameters may include a slope and an intercept of the lane line in the image plane coordinate system.
Specifically, the camera may determine whether the difference between the lane line parameters in the current lane image and those in the previous frame of lane image is less than a preset ratio threshold. The ratio threshold may be an empirically determined value below which the lane line difference is still acceptable for camera posture detection; it may be, for example, 10%.
If the lane line parameters are the slope and the intercept, the slope and the intercept of the two lane lines in the current lane image can be subtracted respectively from the slope and the intercept of the two corresponding lane lines in the previous lane image. Further, the difference is divided by the slope and the intercept of the two corresponding lane lines in the previous frame of lane image to obtain four ratios. An average value can be calculated for the four ratios and then compared as a difference to the ratio threshold; alternatively, the maximum of the four ratios is taken directly and then compared as the difference to the ratio threshold.
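The comparison described above can be sketched as follows; this hypothetical helper uses the mean of the four ratios (the text also allows taking their maximum instead):

```python
def lanes_stable(prev_lines, curr_lines, ratio_threshold=0.10):
    """prev_lines / curr_lines: ((k, b), (k, b)) slope-intercept parameters
    of the left and right lane lines in two consecutive frames.  Returns
    True when the mean relative change of slope and intercept stays below
    ratio_threshold, i.e. the vehicle driving condition is judged stable."""
    ratios = []
    for (kp, bp), (kc, bc) in zip(prev_lines, curr_lines):
        ratios.append(abs(kc - kp) / abs(kp))  # relative slope change
        ratios.append(abs(bc - bp) / abs(bp))  # relative intercept change
    return sum(ratios) / len(ratios) < ratio_threshold
```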
On one hand, if the difference is not smaller than the ratio threshold, the current vehicle running condition is determined to be unstable, and the coordinate position of the vanishing point of the lane line does not need to be determined from the lane image;
on the other hand, if the difference is smaller than the ratio threshold, it is determined that the current vehicle running condition is stable, and at this time, the coordinate position of the vanishing point of the lane line may be determined from the lane image.
Of course, in the initial stage of the installation of the camera, the vehicle needs to be stopped at the center of the flat straight road, and then the coordinate position of the vanishing point of the lane line in the lane image is determined after the lane image is collected by the camera. And when the coordinate position of the lane line vanishing point is the center of the image, the pitch angle and the yaw angle of the camera are both 0, and the current camera installation posture is determined to be correct.
Step 302: and determining candidate pitch angles and candidate yaw angles of the camera based on the coordinate position of the vanishing point and preset camera parameters.
The imaging parameters may include a coordinate position of an image center of the lane image in the image coordinate system, a focal length of the camera, and a size of a unit pixel of the lane image. The candidate pitch angle refers to a pitch angle of the camera determined through a frame of the lane image, and the candidate yaw angle refers to a yaw angle of the camera determined through a frame of the lane image.
The principle of the present application for determining candidate pitch angles and candidate yaw angles of a camera is explained below.
When the camera is in the correct posture, the vanishing point lies at the image center of the lane image, and its coordinates in the image-plane u-v coordinate system are (u0, v0). When the posture of the camera changes, let the candidate pitch angle of the camera be β and the candidate yaw angle be α.
The coordinates (u1, v1) of the vanishing point in the image-plane u-v coordinate system can then be expressed by the following formula (1):
u1 = u0 + f·tanα / (dx·cosβ)
v1 = v0 − f·tanβ / dy    (1)
where f is the focal length of the camera, dx represents the size of a pixel in the X-axis (u-axis) direction, and dy represents the size of a pixel in the Y-axis (v-axis) direction.
Therefore, the candidate pitch angle of the camera can be expressed by the following formula (2):
β = arctan( dy·(v0 − v1) / f )    (2)
The candidate yaw angle of the camera can be expressed by the following formula (3):
α = arctan( dx·(u1 − u0)·cosβ / f )    (3)
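As a sketch, the forward model of formula (1) and its inversion in formulas (2) and (3), under the vanishing-point geometry assumed in this reconstruction (pitch moves the vanishing point vertically and yaw moves it horizontally, consistent with Figs. 4 and 5; the exact expressions are an assumption, since the patent's rendered formulas are not reproduced here):

```python
import math

def vanishing_point_from_pose(alpha, beta, u0, v0, f, dx, dy):
    """Forward model: vanishing point (u1, v1) of the lane lines for a
    camera with yaw alpha and pitch beta (radians), focal length f and
    pixel sizes dx, dy."""
    u1 = u0 + f * math.tan(alpha) / (dx * math.cos(beta))
    v1 = v0 - f * math.tan(beta) / dy
    return (u1, v1)

def candidate_angles(u1, v1, u0, v0, f, dx, dy):
    """Inversion: candidate pitch beta and candidate yaw alpha recovered
    from one observed vanishing point (u1, v1)."""
    beta = math.atan(dy * (v0 - v1) / f)
    alpha = math.atan(dx * (u1 - u0) * math.cos(beta) / f)
    return (beta, alpha)
```

A vanishing point at the image center gives both angles as 0, matching the correct-posture case.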
step 303: and calculating the confidence rates of the candidate pitch angles and the candidate yaw angles determined by the specified number of lane images, determining the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera, and determining the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera.
In order to reduce calculation errors in the pitch angle and the yaw angle, the candidate pitch angles and candidate yaw angles determined from multiple lane images may be recorded; after the candidates determined from a specified number of lane images have been recorded, the confidence rate of the candidate pitch angles and the confidence rate of the candidate yaw angles are calculated respectively. The calculation may employ data processing means such as a Gaussian distribution function or the Euclidean distance, which are not described here.
After the calculation is completed, the candidate pitch angle with the highest confidence rate can be determined as the actual pitch angle of the camera, and the candidate yaw angle with the highest confidence rate can be determined as the actual yaw angle of the camera.
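The patent leaves the confidence-rate computation open (a Gaussian distribution function, the Euclidean distance, and similar means are mentioned); one simple stand-in, shown purely as an illustration, bins the candidate angles and reports the fraction of samples in the most populated bin:

```python
from collections import Counter

def most_confident(candidates, bin_width=0.001):
    """Quantise candidate angles (radians) into bins of width bin_width and
    return (angle, confidence_rate): the representative value of the most
    populated bin and the fraction of samples that fell into it."""
    bins = Counter(round(a / bin_width) for a in candidates)
    best_bin, count = bins.most_common(1)[0]
    return (best_bin * bin_width, count / len(candidates))
```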
In the embodiment of the present application, the camera may be adjusted in attitude based on the determined actual pitch angle and the actual yaw angle.
Such as: if the yaw angle is 1 degree, the camera is shifted by 1 degree to the right, so that the camera can be adjusted by 1 degree to the left. If the yaw angle is-1 degree, the camera is shifted 1 degree to the left, so the camera can be shifted 1 degree to the right.
If the pitch angle is 1 degree, it means that the camera is shifted 1 degree upward, and therefore, the camera can be adjusted 1 degree downward. If the pitch angle is-1 degree, it means that the camera is shifted 1 degree downward, and therefore, the camera can be adjusted 1 degree upward.
In addition, in the embodiment of the present application, after the yaw angle and the pitch angle of the camera have been determined, the error in the actual distance between the vehicle and the lane lines on both sides can be corrected, so that lane departure detection yields an accurate result. The specific calculation process is as follows:
when the coordinates of a point in space in the world coordinate system are expressed as (X)w,Yw,Zw) The coordinates of this point in the u-v coordinate system of the image plane are denoted (u, v) and the coordinates of this point in the camera coordinate system are (X)c,Yc,Zc). The conversion of the coordinates in the world coordinate system to the coordinates in the image coordinate system can be expressed by the following formula (4):
Figure BDA0001788498420000111
wherein f is the focal length of the camera, dx and dy represent the pixel sizes in the X-axis (u-axis) and Y-axis (v-axis) directions, t is a translation vector, and R is a rotation matrix that can be calculated from the determined pitch angle and yaw angle by the following formula (5):
R = Rx(β)·Ry(α) =
[ cosα        0       sinα      ]
[ sinα·sinβ   cosβ   −cosα·sinβ ]
[ −sinα·cosβ  sinβ    cosα·cosβ ]    (5)
wherein alpha is the yaw angle of the camera and beta is the pitch angle of the camera.
The translation vector t may be expressed as t = [0, h, 0]^T, where h represents the height of the camera above the ground.
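The rotation matrix and translation vector used above can be built as follows; the rotation order Rx(β)·Ry(α) is an assumption of this reconstruction:

```python
import math

def rotation_matrix(alpha, beta):
    """Rotation from the world frame to the camera frame: pitch beta about
    the camera X axis composed with yaw alpha about the Y axis."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    return [[ca,       0.0, sa],
            [sa * sb,  cb,  -ca * sb],
            [-sa * cb, sb,  ca * cb]]

def translation_vector(h):
    """Translation: camera mounted at height h above the ground, with the
    camera Y axis (parallel to the image v axis) pointing downward."""
    return [0.0, h, 0.0]
```

With alpha = beta = 0 the rotation reduces to the identity, i.e. the camera coordinate system coincides with the world coordinate system, as defined above.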
Taking the coordinates of any two points of a lane line in the image coordinate system, the slope k of the lane line can be expressed by the following formula (6):
k = (v2 − v1) / (u2 − u1)    (6)
wherein (u1, v1) and (u2, v2) are two points of the lane line in the image coordinate system.
Through the above formula (4), the abscissa u and the ordinate v of a point in the image coordinate system can both be expressed in terms of the elements of the L matrix and the point's coordinates in the world coordinate system, and then substituted into the above formula (6).
When the camera is in the correct posture, the optical axis of the camera is parallel to the lane line, so every point on the lane line is at the same distance from the optical axis; that is, the Xw coordinates of the points on a lane line are identical in the world coordinate system, and their Yw coordinates are all 0. The Zw coordinates of the two points are eliminated during the calculation.
Therefore, the final slope can be expressed by the following formula (7):
Figure BDA0001788498420000115
after all the coefficients are substituted into the calculation, the reduction is carried out, and finally the distance X between the lane line and the camera is obtainedwCan be expressed by the following formula (8):
(formula (8): Xw expressed as a function of the slope k and the elements of the L matrix; shown as an image in the original)
the slope k can be actually determined directly based on the positions of the lane lines in the image, and the slopes of the two lane lines are different, so that the actual distance between the vehicle and the lane lines on the two sides can be calculated through a formula (8); the elements in the L matrix can also be obtained during the calculation of equation (4).
Therefore, after the actual pitch angle and the actual yaw angle are determined, the actual distance between the vehicle and the lane lines on both sides can be determined through the above equations (4) to (8).
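The procedure of equations (4) through (8) can be illustrated numerically (not part of the original disclosure): project two points of a lane line at lateral offset Xw through an assumed pinhole model with R and t built from the pitch and yaw angles, measure the resulting image slope, and invert that relation to recover Xw. Because the closed-form formula (8) is only an image in the text, a bisection solver stands in for it here; the intrinsics, mounting height, and R/t conventions below are illustrative assumptions.

```python
import numpy as np

def project(Xw, Zw, f=800.0, dx=1.0, dy=1.0, u0=640.0, v0=360.0,
            h=1.5, alpha=0.01, beta=0.02):
    """Image coordinates of the lane-line point (Xw, 0, Zw) under an
    assumed convention R = Rx(beta) @ Ry(alpha), t = (0, h, 0)."""
    ca, sa, cb, sb = np.cos(alpha), np.sin(alpha), np.cos(beta), np.sin(beta)
    Ry = np.array([[ca, 0.0, sa], [0.0, 1.0, 0.0], [-sa, 0.0, ca]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cb, -sb], [0.0, sb, cb]])
    K = np.array([[f / dx, 0.0, u0], [0.0, f / dy, v0], [0.0, 0.0, 1.0]])
    p = K @ (Rx @ Ry @ np.array([Xw, 0.0, Zw]) + np.array([0.0, h, 0.0]))
    return p[0] / p[2], p[1] / p[2]

def slope_for_offset(Xw):
    """Image slope (formula (6)) of a lane line at lateral offset Xw,
    sampled at two assumed depths along the lane."""
    (u1, v1), (u2, v2) = project(Xw, 10.0), project(Xw, 30.0)
    return (v2 - v1) / (u2 - u1)

def offset_from_slope(k, lo=0.1, hi=50.0, iters=80):
    """Invert slope -> lateral distance by bisection, a numerical stand-in
    for the closed-form formula (8). The bracket [lo, hi] assumes the lane
    line lies on one known side, since the slope has a pole near Xw = 0."""
    f_lo = slope_for_offset(lo) - k
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        f_mid = slope_for_offset(mid) - k
        if f_lo * f_mid <= 0:
            hi = mid
        else:
            lo, f_lo = mid, f_mid
    return 0.5 * (lo + hi)
```

Running the forward projection and then the solver round-trips the lateral distance, which is the consistency the derivation relies on.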
In summary, the candidate pitch angle and the candidate yaw angle of the camera are determined according to the coordinate position of the vanishing point of the lane line in the lane image in the image coordinate system and the camera shooting parameters, and then the actual pitch angle and the actual yaw angle of the camera are determined based on the specified number of lane images, so that the attitude of the camera is determined according to the actual pitch angle and the actual yaw angle;
Through the above measures, the posture of the camera can be adjusted, avoiding the adverse effect on lane line detection caused by the camera not squarely facing the lane.
Corresponding to the embodiment of the detection method of the camera posture, the application also provides an embodiment of the detection device of the camera posture.
Referring to fig. 6, a block diagram of an embodiment of a camera pose detection apparatus according to the present application is shown:
as shown in fig. 6, the camera attitude detection device 60 includes:
a first determining unit 610 for determining a coordinate position of a vanishing point of a lane line in the lane image in the image coordinate system; wherein the vanishing point is an intersection point of the two lane lines.
And the calculating unit 620 is used for determining a candidate pitch angle and a candidate yaw angle of the camera based on the coordinate position of the vanishing point and preset shooting parameters.
A second determining unit 630, configured to calculate confidence rates of the pitch angle candidates and the yaw angle candidates determined by the specified number of lane images, determine the pitch angle candidate with the highest confidence rate as the actual pitch angle of the camera, and determine the yaw angle candidate with the highest confidence rate as the actual yaw angle of the camera.
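The aggregation performed by the second determining unit can be sketched as follows (an illustration, not the patented implementation). The text does not define how the confidence rate is computed, so a simple histogram vote over quantized candidate angles is assumed here, with the winning bin's mean reported as the actual angle.

```python
from collections import Counter

def vote_actual_angle(candidates, bin_width=0.005):
    """Pick the candidate angle (radians) with the highest confidence rate,
    approximated as the most populated quantization bin over the specified
    number of lane images. The bin width and the voting scheme itself are
    assumptions; the patent does not define the confidence-rate formula."""
    if not candidates:
        raise ValueError("no candidate angles")
    bins = Counter(round(c / bin_width) for c in candidates)
    best_bin, _ = bins.most_common(1)[0]
    members = [c for c in candidates if round(c / bin_width) == best_bin]
    # Report the mean of the winning bin's members as the actual angle.
    return sum(members) / len(members)
```

The same routine would be applied independently to the candidate pitch angles and the candidate yaw angles collected from the specified number of frames.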
In this example, the apparatus further comprises:
an adjusting unit 640 (not shown in the figure) for performing attitude adjustment on the camera based on the determined actual pitch angle and actual yaw angle.
In this example, the first determining unit 610 is further configured to:
acquiring measurement parameters of a vehicle-mounted sensor, and determining whether the current road condition meets a preset lane line detection rule or not based on the measurement parameters;
if so, determining whether the difference between the lane line parameter in the current lane image and the lane line parameter in the previous lane image is smaller than a preset ratio threshold value;
and if so, determining the coordinate position of the vanishing point of the lane line in the image coordinate system from the current lane image.
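The two gating checks above can be sketched in Python. The field names, the relative-difference form of the comparison, and the default threshold are assumptions; the text only states that the road-condition rule must hold and that the lane line parameters must differ from the previous frame by less than a preset ratio threshold.

```python
def accept_frame(curr, prev, ratio_threshold=0.1, road_is_flat_straight=True):
    """Decide whether the current lane image is usable for vanishing-point
    estimation. curr/prev are dicts like {"slope": ..., "intercept": ...}
    describing a lane line in the image plane (assumed representation)."""
    if not road_is_flat_straight:
        return False                 # lane line detection rule not met
    if prev is None:
        return False                 # need a previous frame to compare against
    for key in ("slope", "intercept"):
        denom = abs(prev[key]) or 1e-9
        # Relative difference against the previous frame's parameter.
        if abs(curr[key] - prev[key]) / denom >= ratio_threshold:
            return False
    return True
```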
In this example, the image pickup parameters include a coordinate position of an image center of the lane image in an image coordinate system, a focal length of a camera, and a unit pixel size of the lane image;
the calculating unit 620 is further configured to:
calculating a candidate pitch angle β of the camera based on a first formula:
β = arctan((v1 − v0) · dy / f)
wherein v1 is the ordinate of the vanishing point, v0 is the ordinate of the image center of the lane image, dy is the unit pixel size of the lane image in the ordinate direction, and f is the focal length of the camera;
and calculating a candidate yaw angle α of the camera based on a second formula:
α = arctan((u1 − u0) · dx · cos β / f)
wherein u1 is the abscissa of the vanishing point, u0 is the abscissa of the image center of the lane image, dx is the unit pixel size of the lane image in the abscissa direction, and β is the candidate pitch angle of the camera.
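The vanishing-point and candidate-angle computations can be illustrated as follows (not part of the original disclosure). The first and second formulas appear only as images, so the standard vanishing-point relations β = atan((v1 − v0)·dy/f) and α = atan((u1 − u0)·dx·cos β/f), built from exactly the variables the text lists, are assumed.

```python
import math

def vanishing_point(line1, line2):
    """Intersection of two lane lines, each given as (slope, intercept)
    in the image coordinate system: v = k * u + b."""
    k1, b1 = line1
    k2, b2 = line2
    u = (b2 - b1) / (k1 - k2)
    return u, k1 * u + b1

def candidate_angles(u1, v1, u0, v0, f, dx, dy):
    """Candidate pitch (beta) and yaw (alpha) of the camera from the
    vanishing point (u1, v1), image center (u0, v0), focal length f,
    and unit pixel sizes dx, dy. The closed forms are assumed standard
    vanishing-point relations; the formula images are not reproduced."""
    beta = math.atan((v1 - v0) * dy / f)
    alpha = math.atan((u1 - u0) * dx * math.cos(beta) / f)
    return beta, alpha
```

When the vanishing point coincides with the image center, both candidate angles are zero, i.e. the camera looks straight down the lane.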
In this example, the apparatus further comprises:
a third determining unit 650 (not shown) for determining the actual distance of the vehicle from the lane lines on both sides based on the determined actual pitch angle and the actual yaw angle.
The embodiment of the camera posture detection device can be applied to electronic equipment. The device embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the device in a logical sense is formed by the processor of the electronic equipment where it is located reading the corresponding computer program instructions from the nonvolatile memory into the memory for execution. From the hardware aspect, fig. 7 shows a hardware structure diagram of the electronic equipment where the camera posture detection apparatus of the present application is located; in addition to the processor, the memory, the network interface, and the nonvolatile memory shown in fig. 7, the electronic equipment where the apparatus is located may also include other hardware according to the actual function of the camera posture detection apparatus, which is not described again here.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (8)

1. A method for detecting a pose of a camera, comprising:
acquiring measurement parameters of a vehicle-mounted sensor, and determining whether the current road condition meets a preset lane line detection rule or not based on the measurement parameters; the lane line detection rule is used for indicating that the vehicle runs on a flat straight road;
if so, determining whether the difference between the lane line parameter in the current lane image and the lane line parameter in the previous lane image is smaller than a preset ratio threshold value; the lane line parameters comprise the slope and intercept of the lane line in an image plane coordinate system;
if so, determining the coordinate position of the vanishing point of the lane line in the image coordinate system from the current lane image; wherein the vanishing point is the intersection point of the two lane lines;
determining candidate pitch angles and candidate yaw angles of the camera based on the coordinate position of the vanishing point and preset camera parameters;
and calculating the confidence rates of the candidate pitch angles and the candidate yaw angles determined by the specified number of lane images, determining the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera, and determining the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera.
2. The method of claim 1, further comprising:
and performing attitude adjustment on the camera based on the determined actual pitch angle and the actual yaw angle.
3. The method of claim 1,
the determining candidate pitch angle and candidate yaw angle of the camera based on the coordinate position of the vanishing point and the preset shooting parameter comprises the following steps:
calculating a candidate pitch angle β of the camera based on a first formula:
β = arctan((v1 − v0) · dy / f)
wherein v1 is the ordinate of the vanishing point, v0 is the ordinate of the image center of the lane image, dy is the unit pixel size of the lane image in the ordinate direction, and f is the focal length of the camera;
calculating a candidate yaw angle α of the camera based on a second formula:
α = arctan((u1 − u0) · dx · cos β / f)
wherein u1 is the abscissa of the vanishing point, u0 is the abscissa of the image center of the lane image, and dx is the unit pixel size of the lane image in the abscissa direction.
4. The method of claim 1, further comprising:
and determining the actual distance between the vehicle and the lane lines on the two sides based on the determined actual pitch angle and the actual yaw angle.
5. A camera pose detection apparatus, comprising:
the first determining unit is used for acquiring the measurement parameters of the vehicle-mounted sensor and determining whether the current road condition meets the preset lane line detection rule or not based on the measurement parameters; the lane line detection rule is used for indicating that the vehicle runs on a flat straight road; if so, determining whether the difference between the lane line parameter in the current lane image and the lane line parameter in the previous lane image is smaller than a preset ratio threshold value; the lane line parameters comprise the slope and intercept of the lane line in an image plane coordinate system; if so, determining the coordinate position of the vanishing point of the lane line in the image coordinate system from the current lane image; wherein the vanishing point is the intersection point of the two lane lines;
the computing unit is used for determining candidate pitch angles and candidate yaw angles of the camera based on the coordinate position of the vanishing point and preset camera shooting parameters;
and the second determining unit is used for calculating the confidence rates of the candidate pitch angles and the candidate yaw angles determined by the specified number of lane images, determining the candidate pitch angle with the highest confidence rate as the actual pitch angle of the camera, and determining the candidate yaw angle with the highest confidence rate as the actual yaw angle of the camera.
6. The apparatus of claim 5, further comprising:
and the adjusting unit is used for carrying out attitude adjustment on the camera based on the determined actual pitch angle and the actual yaw angle.
7. The apparatus according to claim 5, wherein the imaging parameters include a coordinate position of an image center of the lane image in an image coordinate system, a focal length of a camera, and a unit pixel size of the lane image;
the computing unit is further configured to:
calculating a candidate pitch angle β of the camera based on a first formula:
β = arctan((v1 − v0) · dy / f)
wherein v1 is the ordinate of the vanishing point, v0 is the ordinate of the image center of the lane image, dy is the unit pixel size of the lane image in the ordinate direction, and f is the focal length of the camera;
calculating a candidate yaw angle α of the camera based on a second formula:
α = arctan((u1 − u0) · dx · cos β / f)
wherein u1 is the abscissa of the vanishing point, u0 is the abscissa of the image center of the lane image, and dx is the unit pixel size of the lane image in the abscissa direction.
8. The apparatus of claim 5, further comprising:
and a third determining unit for determining the actual distance between the vehicle and the lane lines on both sides based on the determined actual pitch angle and the actual yaw angle.
CN201811025833.8A 2018-09-04 2018-09-04 Camera posture detection method and device Active CN109345593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811025833.8A CN109345593B (en) 2018-09-04 2018-09-04 Camera posture detection method and device

Publications (2)

Publication Number Publication Date
CN109345593A CN109345593A (en) 2019-02-15
CN109345593B true CN109345593B (en) 2022-04-26

Family

ID=65293770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811025833.8A Active CN109345593B (en) 2018-09-04 2018-09-04 Camera posture detection method and device

Country Status (1)

Country Link
CN (1) CN109345593B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110363819B (en) * 2019-06-25 2023-03-03 华为技术有限公司 Method for calibrating image acquisition equipment in intelligent automobile and related equipment
CN112184822B (en) * 2019-07-01 2024-01-30 北京地平线机器人技术研发有限公司 Camera pitch angle adjusting method and device, storage medium and electronic equipment
CN112308923B (en) * 2019-07-25 2024-07-26 北京地平线机器人技术研发有限公司 Camera pose adjustment method and device based on lane lines, storage medium and equipment
CN110718068B (en) * 2019-09-27 2020-12-08 华中科技大学 Road monitoring camera installation angle estimation method
CN112866629B (en) * 2019-11-27 2024-06-21 大富科技(安徽)股份有限公司 Control method and terminal for binocular vision application
CN113515973A (en) * 2020-04-09 2021-10-19 北京地平线机器人技术研发有限公司 Data acquisition method, training method, adjustment method and device
CN111739105B (en) * 2020-08-07 2020-11-20 北京理工大学 Automatic learning method for live broadcast station camera shooting scheme
CN113490967A (en) * 2020-09-22 2021-10-08 深圳市锐明技术股份有限公司 Camera calibration method and device and electronic equipment
WO2022204953A1 (en) * 2021-03-30 2022-10-06 深圳市锐明技术股份有限公司 Method and apparatus for determining pitch angle, and terminal device
CN112837352B (en) * 2021-04-20 2021-11-02 腾讯科技(深圳)有限公司 Image-based data processing method, device and equipment, automobile and storage medium
CN114782549B (en) * 2022-04-22 2023-11-24 南京新远见智能科技有限公司 Camera calibration method and system based on fixed point identification
CN115164823B (en) * 2022-05-16 2024-04-02 上海芯翌智能科技有限公司 Method and device for acquiring gyroscope information of camera
CN115103123A (en) * 2022-07-07 2022-09-23 上海智能交通有限公司 Automatic vehicle-mounted holder camera distribution and control method and system based on lane line identification
CN116309814B (en) * 2022-11-29 2024-03-08 北京斯年智驾科技有限公司 Vehicle pose determination method, device, computing equipment and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102324017A (en) * 2011-06-09 2012-01-18 中国人民解放军国防科学技术大学 FPGA (Field Programmable Gate Array)-based lane line detection method
CN103903435A (en) * 2012-12-30 2014-07-02 王方淇 Road condition information obtaining method and mobile terminal
CN105320934A (en) * 2014-07-11 2016-02-10 株式会社电装 Lane boundary line recognition device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102024139A (en) * 2009-09-18 2011-04-20 富士通株式会社 Device and method for recognizing character strings
CN104008645B (en) * 2014-06-12 2015-12-09 湖南大学 One is applicable to the prediction of urban road lane line and method for early warning
CN105447892B (en) * 2015-11-05 2018-04-17 奇瑞汽车股份有限公司 The definite method and device at vehicle yaw angle
US10694175B2 (en) * 2015-12-28 2020-06-23 Intel Corporation Real-time automatic vehicle camera calibration
CN105740809B (en) * 2016-01-28 2019-03-12 东南大学 A kind of highway method for detecting lane lines based on Airborne camera
CN106875448B (en) * 2017-02-16 2019-07-23 武汉极目智能技术有限公司 A kind of vehicle-mounted monocular camera external parameter self-calibrating method
CN108052908A (en) * 2017-12-15 2018-05-18 郑州日产汽车有限公司 Track keeping method



Similar Documents

Publication Publication Date Title
CN109345593B (en) Camera posture detection method and device
US11210534B2 (en) Method for position detection, device, and storage medium
CN108885791B (en) Ground detection method, related device and computer readable storage medium
CN108805934B (en) External parameter calibration method and device for vehicle-mounted camera
CN110163912B (en) Two-dimensional code pose calibration method, device and system
EP2399239B1 (en) Estimation of panoramic camera orientation relative to a vehicle coordinate frame
CN109902637A (en) Method for detecting lane lines, device, computer equipment and storage medium
WO2022078074A1 (en) Method and system for detecting position relation between vehicle and lane line, and storage medium
JP5105481B2 (en) Lane detection device, lane detection method, and lane detection program
CN108376384B (en) Method and device for correcting disparity map and storage medium
JP5228614B2 (en) Parameter calculation apparatus, parameter calculation system and program
CN111508272A (en) Method and apparatus for providing robust camera-based object distance prediction
CN112017236A (en) Method and device for calculating position of target object based on monocular camera
CN113838125A (en) Target position determining method and device, electronic equipment and storage medium
CN110720113A (en) Parameter processing method and device, camera equipment and aircraft
CN117745845A (en) Method, device, equipment and storage medium for determining external parameter information
CN114037977B (en) Road vanishing point detection method, device, equipment and storage medium
JP3319383B2 (en) Roadway recognition device
CN109600598B (en) Image processing method, image processing device and computer readable recording medium
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program
CN114299466A (en) Monocular camera-based vehicle attitude determination method and device and electronic equipment
CN110836656B (en) Anti-shake distance measuring method and device for monocular ADAS (adaptive Doppler analysis System) and electronic equipment
CN114119885A (en) Image feature point matching method, device and system and map construction method and system
CN113766175A (en) Target monitoring method, device, equipment and storage medium
CN116630374B (en) Visual tracking method, device, storage medium and equipment for target object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant