CN114730472A - Calibration method for external parameters of vehicle-mounted camera and related device - Google Patents


Info

Publication number
CN114730472A
CN114730472A (application CN202180006501.9A)
Authority
CN
China
Prior art keywords
coordinate system
vehicle
parallel
camera
parallel lines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180006501.9A
Other languages
Chinese (zh)
Inventor
何启盛
李涵
黄海晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co., Ltd.
Publication of CN114730472A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method and a related device for calibrating the external parameters of a vehicle-mounted camera, in the technical field of camera calibration. In the technical scheme, a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the coordinates, in the actual camera coordinate system, of M first parallel lines in an acquired first image and the constraints of M second parallel lines in a bird's-eye view of the first image, where M is an integer greater than or equal to 3 and the M second parallel lines correspond one-to-one to the M first parallel lines. The transformation relation between the actual camera coordinate system and the world coordinate system is then determined from the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. This improves the flexibility of the external-parameter calibration method for the vehicle-mounted camera.

Description

Calibration method of external parameters of vehicle-mounted camera and related device
Technical Field
The present disclosure relates to the field of camera calibration technologies, and in particular, to a method and a related device for calibrating external parameters of a vehicle-mounted camera.
Background
As a sensor, the vehicle-mounted camera plays an increasingly important role in assisted driving and automatic driving: associating the environment around the vehicle with the digital images captured by the camera can provide information necessary for safe driving. The external parameters of the vehicle-mounted camera play an important role in this association. The external parameters of a vehicle-mounted camera refer to the translation distance and rotation angle of the camera relative to the vehicle.
At present, the external parameters of a vehicle-mounted camera are usually calibrated with the help of parallel lines, which can be lane lines drawn in a dedicated field or lane lines on an ordinary road. Many existing calibration schemes impose specific constraint conditions; for example, the three parallel lines used for calibration may be required to be equally spaced. Specifically, the pixel-coordinate-system coordinates of the three parallel lines are obtained and converted into camera-coordinate-system coordinates, and the equal spacing of the three lines is used as the constraint condition to solve for the external parameters. However, requiring equidistant parallel lines makes the calibration method inflexible.
Therefore, how to improve the flexibility of the external reference calibration method of the vehicle-mounted camera becomes a problem to be solved urgently.
Disclosure of Invention
The application provides a calibration method and a related device for the external parameters of a vehicle-mounted camera, which improve the flexibility of the calibration method.
In a first aspect, the present application provides a calibration method for the external parameters of a vehicle-mounted camera. The method includes: acquiring the coordinates, in the actual camera coordinate system of the vehicle-mounted camera, of M first parallel lines in a first image, where the first image is an image captured while the vehicle to which the vehicle-mounted camera belongs is driving on a target road, the target road includes N parallel lane lines whose pairwise distances are known, the M first parallel lines correspond one-to-one to M of the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M; determining a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints of M second parallel lines in a bird's-eye view of the first image, where the M second parallel lines correspond one-to-one to the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view include a parallel constraint, a vertical constraint, a spacing proportion constraint, and a spacing constraint: the parallel constraint requires the M second parallel lines to be parallel; the vertical constraint requires each of the M second parallel lines to be perpendicular to the x-axis direction of the virtual ideal camera coordinate system; the spacing proportion constraint requires the ratio of the distances between pairs of the M second parallel lines to match the ratio of the actual distances between the corresponding lane lines; and the spacing constraint requires the difference between the distance between any two second parallel lines and the actual distance between the corresponding lane lines to be minimized; and determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
According to the method, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera is first determined from the coordinates, in the actual camera coordinate system, of the M first parallel lines in the acquired first image and the constraints of the M second parallel lines in the bird's-eye view of the first image (the parallel constraint, vertical constraint, spacing proportion constraint, and spacing constraint), where M is an integer greater than or equal to 3 and the M second parallel lines correspond one-to-one to the M first parallel lines. The transformation relation between the actual camera coordinate system and the world coordinate system is then determined from this relation together with the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. Here the first image is captured while the vehicle is driving on a target road containing N parallel lane lines, with N an integer greater than or equal to M. By introducing the virtual ideal camera coordinate system as an intermediate coordinate system, only the distances between the lane lines need to be known; no other specific constraint conditions need to be imposed on the N lane lines of the target road, which improves the flexibility of the external-parameter calibration method for the vehicle-mounted camera.
In one possible implementation, determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints of the M second parallel lines in the bird's-eye view of the first image includes: determining the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the parallel constraint; determining the y-axis rotation angle according to those coordinates and the vertical constraint; determining the z-axis rotation angle according to those coordinates and the spacing proportion constraint; and determining the z-axis translation distance according to those coordinates and the spacing constraint.
In this implementation, the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system are determined in turn from the coordinates of the M first parallel lines in the actual camera coordinate system together with, respectively, the parallel constraint, the vertical constraint, the spacing proportion constraint, and the spacing constraint. This yields the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and improves the accuracy of that transformation relation.
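The four-step decomposition above can be pictured as a sequence of one-dimensional searches, one per extrinsic component. The solver below is an illustrative sketch, not the patent's actual optimizer; each step would scan its own constraint cost (parallel, vertical, spacing proportion, spacing) built from the detected lines.

```python
def grid_search_1d(cost, lo, hi, steps=1000):
    """Scan [lo, hi] uniformly and return the argmin of cost.

    A deliberately simple stand-in for whatever 1-D optimizer is used
    to solve each rotation angle / translation distance in turn."""
    best_x, best_c = lo, cost(lo)
    for k in range(1, steps + 1):
        x = lo + (hi - lo) * k / steps
        c = cost(x)
        if c < best_c:
            best_x, best_c = x, c
    return best_x

# Illustrative use (the four cost functions are hypothetical names):
# alpha = grid_search_1d(parallel_cost_of_x_rotation, -0.2, 0.2)
# beta  = grid_search_1d(vertical_cost_of_y_rotation, -0.2, 0.2)
# gamma = grid_search_1d(ratio_cost_of_z_rotation,    -0.2, 0.2)
# t_z   = grid_search_1d(spacing_cost_of_z_shift,     -1.0, 1.0)
```

Solving the four components sequentially keeps each search one-dimensional, which is why the constraints are stated per component in the text above.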
In one possible implementation, the parallel constraint further includes: 0.5·Σ(θ_i - θ_{i+1})² is minimized, where θ_i is the angle between the i-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the angle between the (i+1)-th second parallel line and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i ≤ 90°, and θ_{i+1} ≤ 90°.
In this implementation, taking the minimization of 0.5·Σ(θ_i - θ_{i+1})² as the parallel constraint to determine the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that rotation angle.
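As a minimal sketch in plain Python (angles in degrees; the function name is an assumption for illustration), the parallel-constraint cost could look like:

```python
def parallel_cost(thetas):
    """0.5 * sum of squared differences between the angles of
    consecutive second parallel lines and the x-axis (degrees).
    Zero exactly when all M lines share the same angle."""
    return 0.5 * sum((a - b) ** 2 for a, b in zip(thetas, thetas[1:]))
```

For perfectly parallel lines the cost vanishes; any residual tilt between neighbouring lines is penalised quadratically.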
In one possible implementation, the vertical constraint further includes: 0.5·Σ(θ_j - 90)² is minimized, where θ_j is the angle between the j-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j ≤ 90°.
In this implementation, taking the minimization of 0.5·Σ(θ_j - 90)² as the vertical constraint to determine the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that rotation angle.
In one possible implementation, the spacing proportion constraint further includes: 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} - W_{s,s+1}/W_{s+1,s+2})² is minimized, where ω_{s,s+1} is the distance between the s-th and (s+1)-th of the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th and (s+2)-th second parallel lines, W_{s,s+1} is the actual distance between the lane lines corresponding to the s-th and (s+1)-th second parallel lines, W_{s+1,s+2} is the actual distance between the lane lines corresponding to the (s+1)-th and (s+2)-th second parallel lines, and s is an integer greater than or equal to 1 and less than M - 1.
In this implementation, taking the minimization of 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} - W_{s,s+1}/W_{s+1,s+2})² as the spacing proportion constraint to determine the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that rotation angle.
In one possible implementation, the spacing constraint further includes: 0.5·Σ(ω_{i,i+1} - W_{i,i+1})² is minimized, where ω_{i,i+1} is the distance between the i-th and (i+1)-th of the M second parallel lines, W_{i,i+1} is the actual distance between the lane lines corresponding to the i-th and (i+1)-th second parallel lines, and i is an integer greater than or equal to 1 and less than M.
In this implementation, taking the minimization of 0.5·Σ(ω_{i,i+1} - W_{i,i+1})² as the spacing constraint to determine the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that translation distance.
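Under the same illustrative assumptions, the remaining three constraint costs can be sketched as below; the ratio form of the spacing proportion cost is an inference from the surrounding definitions rather than the patent's verbatim formula (the original equation images are not reproduced here):

```python
def vertical_cost(thetas):
    """0.5 * sum((theta_j - 90)^2): zero when every second parallel
    line is perpendicular to the x-axis of the virtual ideal frame."""
    return 0.5 * sum((t - 90.0) ** 2 for t in thetas)

def ratio_cost(omegas, widths):
    """Spacing proportion: compare ratios of consecutive bird's-eye
    gaps (omegas) with ratios of the known lane spacings (widths)."""
    return 0.5 * sum(
        (omegas[s] / omegas[s + 1] - widths[s] / widths[s + 1]) ** 2
        for s in range(len(omegas) - 1))

def spacing_cost(omegas, widths):
    """Spacing: compare each bird's-eye gap with the known spacing."""
    return 0.5 * sum((o - w) ** 2 for o, w in zip(omegas, widths))
```

Note that `ratio_cost` is scale-free (it fixes only the z-axis rotation), while `spacing_cost` compares absolute distances and therefore pins down the z-axis translation.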
In one possible implementation, the method further includes: acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system; acquiring positioning information of the vehicle, where the positioning information includes the current position of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle; establishing an observation model of the vehicle according to the positioning information and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system; obtaining the course angle of the vehicle at the current moment from the observation model; compensating the course angle of the vehicle at the current moment according to the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, to obtain a compensated course angle; and updating the course angle in the transformation relation between the actual camera coordinate system and the world coordinate system to the compensated course angle.
In this implementation, the course angle of the vehicle at the current moment is obtained from an observation model established using the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system. This course angle is then compensated using the course angle in the camera's extrinsic parameters under the world coordinate system, and the course angle in the transformation relation between the actual camera coordinate system and the world coordinate system is updated to the compensated value. In this way, the influence of the vehicle's course angle can be compensated when the vehicle is not driving parallel to the lane lines, improving the accuracy of the transformation relation between the actual camera coordinate system and the world coordinate system.
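A minimal sketch of the final update step, under the assumption that the compensation amounts to removing the vehicle's current course angle from the calibrated yaw (the observation model itself is not reproduced here, and the function names are illustrative):

```python
def wrap_deg(angle):
    """Wrap an angle in degrees to the interval [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def compensate_course_angle(calibrated_yaw_deg, vehicle_course_deg):
    """Hypothetical compensation: remove the vehicle's course angle
    relative to the lane direction from the calibrated camera yaw,
    so the result reflects camera-to-vehicle rotation only."""
    return wrap_deg(calibrated_yaw_deg - vehicle_course_deg)
```

The wrap keeps the stored yaw well-defined even when the raw difference crosses the ±180° boundary.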
In a second aspect, the present application provides an apparatus for calibrating external parameters of a vehicle-mounted camera, where the apparatus may include various modules for implementing the method in the first aspect, and the modules may be implemented by software and/or hardware.
In a third aspect, the present application provides a calibration apparatus for external parameters of a vehicle-mounted camera. The apparatus may include a processor coupled with a memory. Wherein the memory is configured to store program code and the processor is configured to execute the program code in the memory to implement the method of the first aspect or any one of the implementations.
Optionally, the apparatus may further comprise the memory.
In a fourth aspect, the present application provides a chip, which includes at least one processor and a communication interface, where the communication interface and the at least one processor are interconnected by a line, and the at least one processor is configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementation manners.
In a fifth aspect, the present application provides a computer readable medium storing program code for execution by a device, the program code comprising instructions for performing the method according to the first aspect or any one of its possible implementations.
In a sixth aspect, the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to the first aspect or any one of its possible implementations.
In a seventh aspect, the present application provides a computing device, including at least one processor and a communication interface, where the communication interface and the at least one processor are interconnected by a line, the communication interface is in communication with a target system, and the at least one processor is configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementations.
In an eighth aspect, the present application provides a computing system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementations.
Drawings
FIG. 1 is a schematic diagram of a pixel coordinate system;
FIG. 2 is a schematic diagram of a camera coordinate system;
FIG. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system of an embodiment of the present application;
FIG. 4 is a schematic view of a world coordinate system of an embodiment of the present application;
FIG. 5 is a schematic diagram of an application scenario according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a calibration method for external parameters of a vehicle-mounted camera according to an embodiment of the present application;
FIG. 7 is a schematic illustration of a first image according to an embodiment of the present application;
FIG. 8 is a schematic view of a bird's eye view according to an embodiment of the present application;
FIG. 9 is a schematic view of a course angle compensation in accordance with an embodiment of the present application;
FIG. 10 is a schematic flowchart illustrating another method for calibrating extrinsic parameters of a vehicle-mounted camera according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a calibration apparatus for external parameters of a vehicle-mounted camera according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a calibration apparatus for external parameters of a vehicle-mounted camera according to another embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of a pixel coordinate system. As shown in Fig. 1, the vertex at the upper-left corner of the pixel coordinate system is the origin O_p; the u-axis points horizontally to the right and the v-axis points vertically downward.
Pixel coordinates describe the location of a pixel in the image. In the pixel coordinate system, the coordinates of any pixel point can be expressed as (u_i, v_i). This representation does not reflect the physical size of objects in the image.
Fig. 2 is a schematic diagram of a camera coordinate system. As shown in Fig. 2, the camera coordinate system takes the optical axis of the camera as the Z_c axis, and the center of the camera's optical system, which is in practice the center of the lens, as the origin O_c. The horizontal axis X_c and the vertical axis Y_c of the camera coordinate system are parallel to the u-axis and the v-axis of the pixel coordinate system, respectively.
Fig. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application. As shown in Fig. 3, L1, L2, and L3 are three parallel lines on the road surface on which the automobile travels. Following the conventions of the camera coordinate system shown in Fig. 2 for the origin and the x-, y-, and z-axes: the center of the camera lens is selected as the origin O_r of the actual camera coordinate system; the optical axis of the camera is selected as the Z_r axis, with Z_r parallel to the road surface and the forward direction of the camera taken as positive Z_r; the direction perpendicular to Z_r and parallel to the road surface is the X_r axis, with the direction from L2 to L3 taken as positive X_r; and the direction perpendicular to the road surface is the Y_r axis (not shown), with, illustratively, the direction pointing into the road surface taken as positive Y_r.
The origin of the virtual ideal camera coordinate system coincides with the origin O_r of the actual camera coordinate system, and its y-axis coincides with Y_r. Rotating the Z_r and X_r axes of the actual camera coordinate system about Y_r by an angle β until X_r is perpendicular to the three parallel lines on the road surface yields the z-axis and x-axis directions of the virtual ideal camera coordinate system.
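The rotation about Y_r by β is a standard axis rotation. The sketch below (plain Python, assuming right-handed axes) shows the matrix form; as a sanity check, rotating a frame's z-axis by β = 90° maps it onto the x-axis:

```python
import math

def rot_y(beta):
    """Rotation matrix about the y-axis by beta radians."""
    c, s = math.cos(beta), math.sin(beta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
```

In the calibration itself β would typically be small: just enough to bring X_r perpendicular to the lane lines.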
Fig. 4 is a schematic diagram of a world coordinate system according to an embodiment of the present application. As shown in Fig. 4, L1, L2, and L3 are three parallel lines on the road surface on which the automobile travels, and ZOX is the virtual ideal camera coordinate system of the camera. Any point on L2 is selected as the origin O_w of the world coordinate system; the X_w axis of the world coordinate system is aligned with the z-axis direction of the virtual ideal camera coordinate system; and the Y_w axis is perpendicular to L2 and parallel to the road surface, with the direction from L2 to L1 taken as positive Y_w.
Fig. 5 is a schematic diagram of an application scenario according to an embodiment of the present application. In the scenario shown in Fig. 5, the external parameters of the camera are calibrated using three parallel lines L1, L2, and L3 of known spacing on the road surface: the vehicle drives from one end of the three parallel lines to the other, completing the calibration. The three parallel lines may be lane lines drawn in a dedicated field or lane lines on an ordinary road.
It is understood that the scenario shown in fig. 5 is only an example, and the technical solution of the present application may also be applied to other scenarios as long as the scenario involves calibration of external parameters of the camera. For example, the technical scheme of the application can also be applied to scenes such as calibration of external parameters of a camera in the intelligent robot.
Fig. 6 is a schematic flowchart of a calibration method for external parameters of a vehicle-mounted camera according to an embodiment of the present application. As shown in fig. 6, the method includes at least S601 to S603.
S601, obtaining the coordinates, in the actual camera coordinate system of the vehicle-mounted camera, of M first parallel lines in a first image, wherein the first image is an image captured by the vehicle-mounted camera while the vehicle to which it belongs is driving on a target road, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines correspond one-to-one to M of the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
In a possible implementation manner, while the vehicle to which the vehicle-mounted camera belongs is driving on the target road, the vehicle-mounted camera photographs M of the N lane lines on the target road to obtain the first image. The first image contains M first parallel lines, which are the images of those M lane lines and correspond to them one-to-one.
Illustratively, fig. 7 is a schematic diagram of a first image according to an embodiment of the present application. As shown in fig. 7, there are 3 first parallel lines L1, L2, and L3 in the first image, which correspond to 3 lane lines of the N lane lines in the target road, respectively, where M is equal to 3 and N is greater than or equal to 3.
The M first parallel lines are extracted from the first image to obtain their coordinates in the pixel coordinate system, and those pixel-coordinate-system coordinates are then converted into coordinates in the camera coordinate system of the vehicle-mounted camera. An example of the pixel coordinate system is the one shown in Fig. 1, and an example of the camera coordinate system is the actual camera coordinate system shown in Fig. 3.
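The pixel-to-camera conversion relies on the camera's intrinsic parameters, which are assumed known. A minimal pinhole back-projection sketch (fx, fy, cx, cy and the depth below are assumed illustrative values, not values from the patent):

```python
def pixel_to_camera(u, v, fx, fy, cx, cy, depth):
    """Back-project pixel (u, v) into the camera frame at a known
    depth Z_c, using the pinhole model:
        X_c = (u - cx) / fx * Z_c,  Y_c = (v - cy) / fy * Z_c."""
    return ((u - cx) / fx * depth, (v - cy) / fy * depth, depth)
```

For points on the lane lines, the depth would come from the ground-plane geometry (camera height and orientation) rather than being measured directly.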
As an example, parallel line regions and contours in the first image are extracted by a segmentation algorithm, and then edge extraction is performed at a sub-pixel level, so as to obtain M first parallel lines. For example, the segmentation algorithm includes a watershed algorithm, and the like.
As another example, straight lines in the first image are extracted through the Hough transform, and the M first parallel lines in the first image are then obtained using methods such as clustering and filtering. The Hough transform is a feature extraction technique in image processing that can detect objects of a specific shape through a voting algorithm; the clustering method groups straight line segments with similar slope and intercept; the filtering method sets a region of interest for screening based on the installation angle and position of the camera.
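The clustering step above can be sketched in plain Python. The segment representation (endpoint tuples) and the slope/intercept thresholds are illustrative assumptions, not values from the application:

```python
def cluster_lines(segments, slope_tol=0.05, intercept_tol=20.0):
    """Group line segments (x1, y1, x2, y2) with similar slope and
    intercept, as one way to merge Hough detections into candidate
    lane lines. Thresholds are illustrative, not from the source."""
    clusters = []
    for x1, y1, x2, y2 in segments:
        if x2 == x1:
            continue  # skip exactly vertical segments in this simple sketch
        slope = (y2 - y1) / (x2 - x1)
        intercept = y1 - slope * x1
        for c in clusters:
            if (abs(slope - c["slope"]) < slope_tol
                    and abs(intercept - c["intercept"]) < intercept_tol):
                c["members"].append((x1, y1, x2, y2))
                break
        else:
            clusters.append({"slope": slope, "intercept": intercept,
                             "members": [(x1, y1, x2, y2)]})
    return clusters

segments = [(0, 10, 100, 110), (5, 16, 95, 106),  # nearly the same line
            (0, 200, 100, 300)]                   # a distinct second line
print(len(cluster_lines(segments)))  # 2 clusters
```

A real pipeline would feed this with the output of a Hough line detector and then filter clusters against the camera's region of interest.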
S602, determining a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of M second parallel lines in the bird's-eye view of the first image, where the M second parallel lines correspond one to one to the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view include a parallel constraint, a vertical constraint, a spacing proportion constraint and a spacing constraint.
The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera consists of the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system. Here, an example of the actual camera coordinate system of the vehicle-mounted camera is the actual camera coordinate system shown in fig. 3, and an example of the virtual ideal camera coordinate system of the vehicle-mounted camera is the virtual ideal camera coordinate system shown in fig. 3.
In one possible implementation manner, an inverse perspective transformation method is adopted to obtain a bird's-eye view of the M first parallel lines according to coordinates of the M first parallel lines in an actual camera coordinate system of the vehicle-mounted camera. The aerial view is provided with M second parallel lines, and the M second parallel lines correspond to the M first parallel lines one to one.
Exemplarily, fig. 8 is a schematic diagram of a bird's-eye view according to an embodiment of the present application. As shown in fig. 8, there are 3 second parallel lines L1, L2 and L3 in the bird's-eye view, where M equals 3, θ_1 is the angle between the second parallel line L1 and the x-axis direction of the virtual ideal camera coordinate system, θ_2 is the angle between the second parallel line L2 and the x-axis direction of the virtual ideal camera coordinate system, θ_3 is the angle between the second parallel line L3 and the x-axis direction of the virtual ideal camera coordinate system, ω_12 is the distance between the second parallel line L1 and the second parallel line L2, and ω_23 is the distance between the second parallel line L2 and the second parallel line L3.
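The quantities θ and ω described above can be computed from the detected lines. A minimal sketch, assuming each line is given by two points and the lane lines are near-vertical in the bird's-eye view so that spacing can be approximated by the difference of x intercepts:

```python
import math

def line_angle_deg(p0, p1):
    """Angle in [0, 90] degrees between the line through p0 and p1
    and the x-axis direction."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    ang = abs(math.degrees(math.atan2(dy, dx))) % 180.0
    return min(ang, 180.0 - ang)

def parallel_spacing(x_a, x_b):
    """Spacing between two near-vertical lane lines in the bird's-eye
    view, approximated by the difference of their x intercepts."""
    return abs(x_b - x_a)

# Hypothetical bird's-eye-view line: nearly perpendicular to the x-axis
theta1 = line_angle_deg((0.0, 0.0), (0.1, 10.0))
print(theta1 > 89.0)               # close to 90 when well calibrated
print(parallel_spacing(0.0, 3.5))  # omega_12 = 3.5 (one lane width)
```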
The parallel constraint requires that the M second parallel lines be parallel to one another; the vertical constraint requires that each of the M second parallel lines be perpendicular to the x-axis direction of the virtual ideal camera coordinate system; the spacing proportion constraint requires that the ratio of the distances between any two pairs of the M second parallel lines be the same as the ratio of the distances between the corresponding lane lines; and the spacing constraint requires that the difference between the distance between any two second parallel lines and the actual distance between the corresponding lane lines be minimized.
In one possible implementation manner, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is sequentially optimized by adopting the optimization sequence of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
The x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or similar methods according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint.
Illustratively, the parallel constraint includes minimizing the loss function 0.5·Σ(θ_i − θ_{i+1})², where θ_i is the angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
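The loss 0.5·Σ(θ_i − θ_{i+1})² over consecutive line angles translates directly into code; a minimal sketch with angles in degrees:

```python
def parallel_loss(thetas):
    """0.5 * sum((theta_i - theta_{i+1})^2) over consecutive second
    parallel lines; zero when all lines share the same angle."""
    return 0.5 * sum((a - b) ** 2 for a, b in zip(thetas, thetas[1:]))

print(parallel_loss([89.0, 89.0, 89.0]))  # 0.0 (already parallel)
print(parallel_loss([88.0, 89.0, 90.0]))  # 1.0
```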
The y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or similar methods according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the vertical constraint.
Illustratively, the vertical constraint includes minimizing the loss function 0.5·Σ(θ_j − 90)², where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
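The loss 0.5·Σ(θ_j − 90)² can likewise be sketched directly, with angles in degrees:

```python
def vertical_loss(thetas):
    """0.5 * sum((theta_j - 90)^2): zero when every second parallel
    line is perpendicular to the x-axis of the virtual ideal frame."""
    return 0.5 * sum((t - 90.0) ** 2 for t in thetas)

print(vertical_loss([90.0, 90.0]))  # 0.0 (all lines perpendicular)
print(vertical_loss([89.0, 91.0]))  # 1.0
```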
The z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or similar methods according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing proportion constraint.
Illustratively, the spacing proportion constraint includes minimizing the loss function 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})², where ω_{s,s+1} is the distance between the s-th and (s+1)-th second parallel lines of the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th and (s+2)-th second parallel lines of the M second parallel lines, W_{s,s+1} is the actual distance between the lane lines corresponding to the s-th and (s+1)-th second parallel lines, W_{s+1,s+2} is the actual distance between the lane lines corresponding to the (s+1)-th and (s+2)-th second parallel lines, and s is an integer greater than or equal to 1 and less than M−1.
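A sketch of a spacing-ratio loss consistent with the quantities defined here, comparing measured spacing ratios in the bird's-eye view with known lane-width ratios (the exact squared-ratio form is an assumption reconstructed from the surrounding description):

```python
def spacing_ratio_loss(omegas, widths):
    """0.5 * sum over s of (omega_{s,s+1} / omega_{s+1,s+2}
    - W_{s,s+1} / W_{s+1,s+2})^2, where omegas are measured spacings
    between adjacent second parallel lines and widths are the known
    actual lane-line spacings."""
    return 0.5 * sum(
        (omegas[s] / omegas[s + 1] - widths[s] / widths[s + 1]) ** 2
        for s in range(len(omegas) - 1))

# Measured spacings match the known equal-lane-width ratio exactly
print(spacing_ratio_loss([3.4, 3.4], [3.5, 3.5]))  # 0.0
```

The ratio form makes this term insensitive to overall scale, which is why it constrains only the z-axis rotation and leaves the z-axis translation (height) to the spacing constraint.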
The z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or similar methods according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.
Illustratively, the spacing constraint includes minimizing the loss function 0.5·Σ(ω_{i,i+1} − W_{i,i+1})², where ω_{i,i+1} is the distance between the i-th and (i+1)-th second parallel lines of the M second parallel lines, W_{i,i+1} is the actual distance between the lane lines corresponding to the i-th and (i+1)-th second parallel lines, and i is an integer greater than or equal to 1 and less than M.
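The loss 0.5·Σ(ω_{i,i+1} − W_{i,i+1})² in code, with spacings in the same length unit as the known lane widths:

```python
def spacing_loss(omegas, widths):
    """0.5 * sum((omega_{i,i+1} - W_{i,i+1})^2): drives the measured
    spacings toward the known lane widths, which fixes the z-axis
    translation (height) of the virtual ideal camera."""
    return 0.5 * sum((o - w) ** 2 for o, w in zip(omegas, widths))

print(spacing_loss([3.5, 3.5], [3.5, 3.5]))         # 0.0
print(round(spacing_loss([3.4, 3.6], [3.5, 3.5]), 4))  # 0.01
```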
By using the optimization sequence of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system, errors caused by coupling of Euler angles can be avoided, and the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is improved.
In another possible implementation, the constraint of the M second parallel lines in the bird's-eye view includes minimizing the loss function α·Σ(θ_n − 90)² + β·Σ(ω_{m,m+1} − W_{m,m+1})², where α and β are weight parameters, θ_n is the angle between the n-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, ω_{m,m+1} is the distance between the m-th and (m+1)-th second parallel lines of the M second parallel lines, W_{m,m+1} is the actual distance between the lane lines corresponding to the m-th and (m+1)-th second parallel lines, n is an integer greater than or equal to 1 and less than or equal to M, and m is an integer greater than or equal to 1 and less than M.
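The joint loss α·Σ(θ_n − 90)² + β·Σ(ω_{m,m+1} − W_{m,m+1})² combines the angle and spacing terms; the default weights below are illustrative assumptions:

```python
def joint_loss(thetas, omegas, widths, alpha=1.0, beta=1.0):
    """alpha * sum((theta_n - 90)^2) + beta * sum((omega_{m,m+1}
    - W_{m,m+1})^2); alpha and beta weight the angle and spacing
    terms when all four parameters are optimized jointly."""
    angle_term = sum((t - 90.0) ** 2 for t in thetas)
    spacing_term = sum((o - w) ** 2 for o, w in zip(omegas, widths))
    return alpha * angle_term + beta * spacing_term

print(joint_loss([90.0, 90.0, 90.0], [3.5, 3.5], [3.5, 3.5]))  # 0.0
```

Because both terms are minimized at once over all four parameters, a poor initial guess can trap a gradient-based optimizer in a local minimum, which is the drawback discussed next.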
With the constraint method provided in this implementation, the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system can be obtained simultaneously by joint optimization or similar methods; however, if the initial values are chosen poorly, only a local optimum of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is obtained rather than the global optimum.
S603, determining the transformation relation between the actual camera coordinate system and the world coordinate system of the vehicle-mounted camera according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
The transformation relation between the virtual ideal camera coordinate system and the world coordinate system is an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is an external parameter of the vehicle-mounted camera, and includes an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, and may also be referred to as a roll angle (roll), a pitch angle (pitch), a yaw angle (yaw) and a translation distance. Among them, an example of the actual camera coordinate system of the onboard camera may be the actual camera coordinate system of the onboard camera shown in fig. 3, an example of the virtual ideal camera coordinate system of the onboard camera may be the virtual ideal camera coordinate system of the onboard camera shown in fig. 3 or 4, and an example of the world coordinate system may be the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to compose the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera with the transformation relation between the virtual ideal camera coordinate system and the world coordinate system, to obtain the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, namely the external parameters of the vehicle-mounted camera.
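The composition of the two transformation relations amounts to chaining rotation matrices. A minimal sketch with rotations about the x-axis only (the angle values are hypothetical, and a full implementation would also chain y/z rotations and the translation):

```python
import math

def rot_x(a):
    """3x3 rotation matrix about the x-axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(A, B):
    """Plain 3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

# Chaining: R(actual -> world) = R(ideal -> world) @ R(actual -> ideal)
R_actual_to_ideal = rot_x(math.radians(5.0))   # hypothetical calibrated value
R_ideal_to_world = rot_x(math.radians(-90.0))  # hypothetical fixed relation
R_actual_to_world = matmul(R_ideal_to_world, R_actual_to_ideal)
print(round(R_actual_to_world[1][1], 4))  # cos(-85 deg), about 0.0872
```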
After the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is obtained, the y-axis translation amount of the vehicle-mounted camera in the world coordinate system is obtained; and acquiring the positioning information of the vehicle from a sensor of the vehicle to which the vehicle-mounted camera belongs, wherein the sensor of the vehicle to which the vehicle-mounted camera belongs comprises a wheel speed meter, an Inertial Measurement Unit (IMU), a Global Positioning System (GPS) and the like, and correspondingly, the positioning information of the vehicle comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle and the like.
Establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in a world coordinate system; acquiring a course angle of the vehicle at the current moment according to an observation model of the vehicle; compensating the course angle of the vehicle at the current moment according to the obtained course angle in the external parameters of the vehicle-mounted camera to obtain a compensated course angle; and updating the course angle in the external parameters of the vehicle-mounted camera to be the compensated course angle.
Exemplarily, fig. 9 is a schematic view of course angle compensation according to an embodiment of the present application. As shown in fig. 9, the vehicle course angle is the course angle of the vehicle at the current time obtained from the observation model of the vehicle, and the camera course angle is the course angle in the determined external parameters of the vehicle-mounted camera. The camera course angle is subtracted from the vehicle course angle to obtain the course angle from the camera to the vehicle, which may also be referred to as the compensated course angle; the course angle in the determined external parameters of the vehicle-mounted camera is then updated to the compensated course angle to compensate for the influence of the vehicle course angle obtained from vehicle positioning.
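The subtraction described for fig. 9 can be sketched as follows; the function name, degree units, and sign convention are illustrative assumptions:

```python
def compensate_course(vehicle_course_deg, camera_course_deg):
    """Subtract the camera course angle from the vehicle course angle
    (sign convention assumed from the description above) to obtain the
    camera-to-vehicle course, i.e. the compensated course angle."""
    return vehicle_course_deg - camera_course_deg

# Hypothetical values: vehicle course 2.0 deg, camera course 1.5 deg
print(compensate_course(2.0, 1.5))  # 0.5
```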
In another possible implementation manner, the plurality of external parameters of the vehicle-mounted camera after the course angle compensation is completed are cached, the plurality of external parameters of the vehicle-mounted camera after the course angle compensation is completed are filtered, the filtered external parameters of the vehicle-mounted camera are obtained, and the external parameters of the vehicle-mounted camera are updated to the filtered external parameters of the vehicle-mounted camera.
Illustratively, a plurality of external parameters of the vehicle-mounted camera after course angle compensation are cached, and periodic optimal estimation is carried out. If the number of cached external parameters of the vehicle-mounted camera after course angle compensation does not exceed a preset threshold when the vehicle drives out of the calibration site, all of the cached external parameters are used for optimal estimation to complete calibration; if the number exceeds the preset threshold when the vehicle drives out of the calibration site, cached external parameters after course angle compensation are selected at a certain interval (for example, every 10 frames) for optimal estimation. The calibration site is the area where the N lane lines of the target road used for external parameter calibration of the vehicle-mounted camera are located.
Preferably, a group of data is cached for each type of external parameter in the external parameters of the vehicle-mounted camera after course angle compensation is completed, that is, a group of data is cached for a roll angle, a pitch angle, a course angle and a translation distance in the external parameters of the vehicle-mounted camera after course angle compensation is completed, and a kernel density estimation method is adopted to obtain an external parameter value with the maximum probability in each group of data, namely the optimal estimation of the type of external parameter.
Specifically, the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, and is robust to a certain amount of noise interference. For each type of external parameter, a fitted kernel function (preferably a Gaussian kernel) is estimated from the group of parameter values of that type, the peak of the kernel function is found, and the corresponding abscissa value (where the probability of the Gaussian function is maximum) is the optimal value of that type of external parameter.
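A one-dimensional Gaussian kernel density estimate with a grid search for the peak can be sketched as follows; the bandwidth, grid resolution, and sample values are illustrative assumptions:

```python
import math

def kde_peak(samples, bandwidth=0.5, grid_steps=200):
    """1-D Gaussian kernel density estimate: return the grid point
    where the estimated density is highest, used as the optimal value
    of one type of external parameter."""
    lo, hi = min(samples) - 3 * bandwidth, max(samples) + 3 * bandwidth
    best_x, best_d = lo, -1.0
    for i in range(grid_steps + 1):
        x = lo + (hi - lo) * i / grid_steps
        d = sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                for s in samples)
        if d > best_d:
            best_x, best_d = x, d
    return best_x

yaw_samples = [1.9, 2.0, 2.1, 2.0, 5.0]  # one outlier at 5.0
print(round(kde_peak(yaw_samples), 1))   # 2.0: the outlier is ignored
```

Unlike a plain mean, the density peak is pulled toward the dense cluster of values, which is the noise resistance described above.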
Alternatively, the optimal value of each external parameter of the four types of external parameters can be estimated simultaneously by adopting a four-dimensional kernel density estimation method.
Optionally, a group of data is cached for each type of external parameter, and a median of each type of external parameter is obtained by using a median filtering method, that is, the median is the optimal value of each type of external parameter.
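The median-filtering alternative is straightforward with the standard library; the parameter names and buffered values below are illustrative assumptions:

```python
import statistics

def optimal_extrinsics(buffered):
    """Median-filter each buffered group of external parameters;
    the median of each group serves as that parameter's optimal value."""
    return {name: statistics.median(vals) for name, vals in buffered.items()}

buffered = {"roll": [0.9, 1.0, 1.1], "pitch": [-0.1, 0.0, 0.2],
            "yaw": [2.0, 2.1, 1.9], "z": [1.49, 1.50, 1.52]}
print(optimal_extrinsics(buffered)["yaw"])  # 2.0
```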
Optionally, a group of data is cached for each type of external parameter, an average value of each group of data is calculated by adopting an average filtering method, and the obtained average value of each group of data is the optimal value of each type of external parameter.
And when the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold value, the calibration result is converged, and the calibration is stopped to obtain the external parameters of the vehicle-mounted camera.
According to the technical solution, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the coordinates, in the actual camera coordinate system, of the M first parallel lines acquired by the vehicle-mounted camera and the parallel, vertical, spacing proportion and spacing constraints of the M second parallel lines in the bird's-eye view of the first image, following the optimization sequence of the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is then determined from the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. This improves the flexibility of the external parameter calibration method of the vehicle-mounted camera and the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system (namely, the external parameters of the vehicle-mounted camera).
Fig. 10 is a schematic flowchart of another calibration method for external parameters of a vehicle-mounted camera according to an embodiment of the application. As shown in fig. 10, the method includes at least S1001 to S1011.
S1001, obtaining coordinates of M first parallel lines in a first image in the actual camera coordinate system of a vehicle-mounted camera, where the first image is an image captured by the vehicle-mounted camera while the vehicle to which the camera belongs is driving on a target road, the target road includes N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines correspond one to one to M lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
S1002, determining a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of M second parallel lines in the bird's-eye view of the first image, where the M second parallel lines correspond one to one to the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view include a parallel constraint, a vertical constraint, a spacing proportion constraint and a spacing constraint.
It should be noted that S601 to S602 may be referred to in S1001 to S1002, and are not described herein again.
S1003, determining a first external parameter of the vehicle-mounted camera according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
The transformation relation between the virtual ideal camera coordinate system and the world coordinate system is x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is an external parameter of the vehicle-mounted camera, and includes an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, and may also be referred to as a roll angle (roll), a pitch angle (pitch), a yaw angle (yaw) and a translation distance. Here, an example of the actual camera coordinate system of the onboard camera may be the actual camera coordinate system of the onboard camera shown in fig. 3, an example of the virtual ideal camera coordinate system of the onboard camera may be the virtual ideal camera coordinate system of the onboard camera shown in fig. 3 or 4, and an example of the world coordinate system may be the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to compose the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera with the transformation relation between the virtual ideal camera coordinate system and the world coordinate system, to obtain a first transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, namely the first external parameter of the vehicle-mounted camera.
And S1004, acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
As an example, in the world coordinate system shown in fig. 4, the position of the vehicle-mounted camera on the Y_w axis of the world coordinate system is the y-axis translation amount of the vehicle-mounted camera in the world coordinate system.
S1005, establishing an observation model of the vehicle according to the positioning information of the vehicle to which the vehicle-mounted camera belongs and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, wherein the observation model comprises the influence of the y-axis translation amount on the state quantity.
In one possible implementation, the location information of the vehicle is obtained from a sensor of the vehicle to which the vehicle-mounted camera belongs, wherein the sensor of the vehicle to which the vehicle-mounted camera belongs includes a wheel speed meter, an IMU, a GPS, and the like, and accordingly the location information of the vehicle includes current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle, and the like.
And S1006, acquiring the course angle of the vehicle at the current moment according to the observation model of the vehicle.
In one possible implementation, the observation model of the vehicle is filtered and updated, and the heading angle of the vehicle at the current moment is obtained.
S1007, the course angle of the vehicle at the current moment is compensated according to the course angle in the first external parameter of the vehicle-mounted camera, and the compensated course angle is obtained.
In a possible implementation manner, as in the schematic view of course angle compensation shown in fig. 9, the vehicle course angle is the course angle of the vehicle at the current time obtained from the observation model of the vehicle, and the camera course angle is the course angle in the determined first external parameter of the vehicle-mounted camera. The camera course angle is subtracted from the vehicle course angle to obtain the course angle from the camera to the vehicle, which may also be referred to as the compensated course angle; the course angle in the determined external parameters of the vehicle-mounted camera is then updated to the compensated course angle to compensate for the influence of the vehicle course angle obtained from vehicle positioning.
And S1008, updating the course angle in the first external parameter of the vehicle-mounted camera to the compensated course angle.
The first external parameters of the vehicle-mounted camera comprise a first roll angle, a first pitch angle, a first course angle and a first translation distance. And updating a first course angle in the first external parameters of the vehicle-mounted camera into a compensated course angle, wherein the updated first external parameters of the vehicle-mounted camera comprise a first roll angle, a first pitch angle, a compensated course angle and a first translation distance.
S1009, caching a plurality of first external parameters of the vehicle-mounted camera.
In a possible implementation manner, in the calibration process of the external parameters of the vehicle-mounted camera, during the period from the time when a vehicle to which the vehicle-mounted camera belongs enters the calibration site to the time when the vehicle leaves the calibration site, the vehicle-mounted camera shoots multiple first images, multiple first external parameters of the vehicle-mounted camera are obtained according to the multiple first images, and the multiple first external parameters of the vehicle-mounted camera are cached. The calibration site is an area where N lane lines in a target road for external reference calibration of the vehicle-mounted camera are located.
S1010, filtering the cached first external parameters of the vehicle-mounted camera to obtain second external parameters of the vehicle-mounted camera.
In one possible implementation, a plurality of first external parameters of the cached vehicle-mounted camera are periodically and optimally estimated.
Optionally, if the number of cached first external parameters of the vehicle-mounted camera does not exceed a preset threshold during the period from when the vehicle to which the vehicle-mounted camera belongs enters the calibration site to when it exits the calibration site, all of the cached first external parameters of the vehicle-mounted camera are used for optimal estimation to complete calibration; if the number of cached first external parameters of the vehicle-mounted camera exceeds the preset threshold during that period, cached first external parameters of the vehicle-mounted camera are selected at a certain interval (for example, every 10 frames) for optimal estimation.
Preferably, a group of data is cached for each type of external parameter in the plurality of first external parameters of the vehicle-mounted camera, that is, a group of data is cached for each roll angle, pitch angle, course angle and translation distance in the plurality of first external parameters of the vehicle-mounted camera, and an external parameter value with the maximum probability in each group of data is obtained by adopting a kernel density estimation method, that is, the optimal estimation of the type of external parameter is obtained.
Specifically, the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, and is robust to a certain amount of noise interference. For each type of external parameter, a fitted kernel function (preferably a Gaussian kernel) is estimated from the group of parameter values of that type, the peak of the kernel function is found, and the corresponding abscissa value (where the probability of the Gaussian function is maximum) is the optimal value of that type of external parameter.
Alternatively, the optimal value of each external parameter of the four types of external parameters can be estimated simultaneously by adopting a four-dimensional kernel density estimation method.
Optionally, a group of data is cached for each type of external parameter, and a median of each type of external parameter is obtained by using a median filtering method, that is, the median is the optimal value of each type of external parameter.
Optionally, a group of data is cached for each type of external parameter, an average value of each group of data is calculated by adopting an average filtering method, and the obtained average value of each group of data is the optimal value of each type of external parameter.
And when the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold value, the calibration result is converged, and the calibration is stopped to obtain a second external parameter of the vehicle-mounted camera.
S1011, the external parameter of the vehicle-mounted camera is updated to the second external parameter.
According to the technical scheme, the course angle of the vehicle at the current moment is compensated by the course angle in the external parameters of the vehicle-mounted camera under the world coordinate system to obtain a compensated course angle, and the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to the compensated course angle. In this way, the influence of the vehicle's course angle can be compensated even when the vehicle does not drive exactly parallel to the lane lines, which improves the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system.
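A minimal sketch of the course-angle compensation step, assuming the compensation amounts to subtracting the vehicle's current course angle from the calibrated course angle and wrapping the result to (−180°, 180°]; the sign convention and the function names are assumptions, not details specified by this application.

```python
def wrap_deg(angle):
    """Wrap an angle in degrees to the interval (-180, 180]."""
    return -((-angle + 180.0) % 360.0 - 180.0)

def compensate_heading(extrinsic_yaw_deg, vehicle_yaw_deg):
    """Remove the vehicle's instantaneous course angle from the calibrated
    yaw, so the extrinsic yaw no longer assumes the vehicle drives exactly
    parallel to the lane lines (the sign convention here is an assumption)."""
    return wrap_deg(extrinsic_yaw_deg - vehicle_yaw_deg)

print(compensate_heading(2.0, 0.5))     # 1.5
print(compensate_heading(-179.0, 2.0))  # 179.0 (wrapped across the ±180 seam)
```

Wrapping matters because a naive subtraction near ±180° would otherwise feed an out-of-range angle back into the camera-to-world transformation.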
Fig. 11 is a schematic structural diagram of a calibration apparatus for external parameters of a vehicle-mounted camera according to an embodiment of the present application. As shown in fig. 11, the apparatus 1100 may include an acquisition module 1101 and a processing module 1102. The apparatus 1100 may be used to implement the method shown in any of the embodiments described above.
In one possible implementation, the apparatus 1100 may be used to implement the method shown in fig. 6 described above. For example, the acquiring module 1101 is configured to implement S601, and the processing module 1102 is configured to implement S602 and S603.
In another possible implementation, the apparatus 1100 further includes a compensation module, an update module, and a cache module. The apparatus 1100 in this implementation may be used to implement the method shown in fig. 10 described above. For example, the obtaining module 1101 is configured to implement S1001, S1004, and S1006, the processing module 1102 is configured to implement S1002, S1003, S1005, and S1010, the compensating module is configured to implement S1007, the updating module is configured to implement S1008 and S1011, and the caching module is configured to implement S1009.
Fig. 12 is a schematic structural diagram of a calibration apparatus for external parameters of a vehicle-mounted camera according to another embodiment of the present application. The apparatus 1200 shown in fig. 12 may be used to execute the calibration method for external parameters of the vehicle-mounted camera shown in any one of the above embodiments.
As shown in fig. 12, the apparatus 1200 of the present embodiment includes: memory 1201, processor 1202, communication interface 1203, and bus 1204. The memory 1201, the processor 1202, and the communication interface 1203 are communicatively connected to each other through a bus 1204.
The memory 1201 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). The memory 1201 may store a program; when the program stored in the memory 1201 is executed by the processor 1202, the processor 1202 is configured to perform the steps of the methods shown in fig. 6 and fig. 10.
The processor 1202 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, configured to execute related programs, so as to implement the method for calibrating the external parameters of the vehicle-mounted camera according to the embodiments of the present application.
The processor 1202 may also be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the method of the embodiments of the present application may be implemented by integrated logic circuits of hardware or instructions in the form of software in the processor 1202.
The processor 1202 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the method disclosed with reference to the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1201; the processor 1202 reads the information in the memory 1201 and, in combination with its hardware, performs the functions required by the methods of the embodiments of the present application, for example, the steps/functions of the embodiments shown in fig. 6 and fig. 10.
The communication interface 1203 may use transceiver means such as, but not limited to, a transceiver to enable communication between the apparatus 1200 and other devices or communication networks.
The bus 1204 may include pathways to transfer information between various components of the apparatus 1200 (e.g., memory 1201, processor 1202, communication interface 1203).
It should be understood that the apparatus 1200 shown in the embodiment of the present application may be an electronic device, or may also be a chip configured in the electronic device.
It should be understood that the processor in the embodiments of the present application may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It should also be appreciated that the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used for implementation, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or the computer program are loaded or executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner or in a wireless manner (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium. The semiconductor medium may be a solid-state drive.
It should be understood that the term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship, which may be understood with reference to the context.
In the present application, "at least one" means one or more, and "a plurality of" means two or more. "At least one of the following" or similar expressions refer to any combination of the listed items, including any combination of singular or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c may be singular or plural.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application essentially, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (17)

1. A calibration method for external parameters of a vehicle-mounted camera is characterized by comprising the following steps:
acquiring coordinates of M first parallel lines in a first image in an actual camera coordinate system of the vehicle-mounted camera, wherein the first image is an image captured on a target road during driving of a vehicle to which the vehicle-mounted camera belongs, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M lane lines in the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M;
determining a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and constraints of M second parallel lines in a bird's-eye view of the first image, wherein the M second parallel lines are in one-to-one correspondence with the M first parallel lines, the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a spacing proportion constraint and a spacing constraint, the parallel constraint comprises that the M second parallel lines are parallel, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the spacing proportion constraint comprises that the distance proportion between any two of the M second parallel lines is the same as the distance proportion between the lane lines corresponding to the any two second parallel lines, and the spacing constraint comprises that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to the any two second parallel lines is minimized;
and determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
2. The method of claim 1, wherein determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of the M second parallel lines in the bird's-eye view of the first image comprises:
determining an x-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to coordinates of the M first parallel lines in the actual camera coordinate system and the parallel constraint;
determining a y-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the vertical constraint;
determining a z-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to coordinates of the M first parallel lines in the actual camera coordinate system and the spacing proportion constraint;
and determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.
3. The method of claim 1 or 2, wherein the parallel constraint further comprises: 0.5·Σ(θ_i − θ_{i+1})² is minimized, where θ_i is the included angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the included angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
4. The method of any one of claims 1 to 3, wherein the vertical constraint further comprises: 0.5·Σ(θ_j − 90)² is minimized, where θ_j is the included angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
5. The method of any one of claims 1 to 4, wherein the spacing proportion constraint further comprises: 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})² is minimized, where ω_{s,s+1} is the distance between the s-th and (s+1)-th second parallel lines of the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th and (s+2)-th second parallel lines of the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M−1.
6. The method of any one of claims 1 to 5, wherein the spacing constraint further comprises: 0.5·Σ(ω_{i,i+1} − W_{i,i+1})² is minimized, where ω_{i,i+1} is the distance between the i-th and (i+1)-th second parallel lines of the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
7. The method according to any one of claims 1 to 6, further comprising:
acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system;
acquiring positioning information of a vehicle, wherein the positioning information of the vehicle comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle;
establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system;
acquiring a course angle of the vehicle at the current moment according to the observation model of the vehicle;
compensating the course angle of the vehicle at the current moment according to the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to obtain a compensated course angle;
and updating the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system into the compensated course angle.
8. An external parameter calibration device for a vehicle-mounted camera, the device comprising:
an acquisition module, configured to acquire coordinates of M first parallel lines in a first image in an actual camera coordinate system of the vehicle-mounted camera, wherein the first image is an image captured on a target road during driving of a vehicle to which the vehicle-mounted camera belongs, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M lane lines in the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M;
a processing module, configured to determine a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and constraints of M second parallel lines in a bird's-eye view of the first image, wherein the M second parallel lines are in one-to-one correspondence with the M first parallel lines, the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a spacing proportion constraint and a spacing constraint, the parallel constraint comprises that the M second parallel lines are parallel, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the spacing proportion constraint comprises that the distance proportion between any two of the M second parallel lines is the same as the distance proportion between the lane lines corresponding to the any two second parallel lines, and the spacing constraint comprises that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to the any two second parallel lines is minimized;
the processing module is further configured to determine a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system according to the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system.
9. The apparatus of claim 8, wherein the processing module is specifically configured to:
determining an x-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to coordinates of the M first parallel lines in the actual camera coordinate system and the parallel constraint;
determining a y-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the vertical constraint;
determining a z-axis rotation angle of the virtual ideal camera coordinate system relative to an actual camera coordinate system of the vehicle-mounted camera according to coordinates of the M first parallel lines in the actual camera coordinate system and the spacing proportion constraint;
and determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.
10. The apparatus of claim 8 or 9, wherein the parallel constraint further comprises: 0.5·Σ(θ_i − θ_{i+1})² is minimized, where θ_i is the included angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the included angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
11. The apparatus of any one of claims 8 to 10, wherein the vertical constraint further comprises: 0.5·Σ(θ_j − 90)² is minimized, where θ_j is the included angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
12. The apparatus of any one of claims 8 to 11, wherein the spacing proportion constraint further comprises: 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})² is minimized, where ω_{s,s+1} is the distance between the s-th and (s+1)-th second parallel lines of the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th and (s+2)-th second parallel lines of the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M−1.
13. The apparatus of any one of claims 8 to 12, wherein the spacing constraint further comprises: 0.5·Σ(ω_{i,i+1} − W_{i,i+1})² is minimized, where ω_{i,i+1} is the distance between the i-th and (i+1)-th second parallel lines of the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
14. The apparatus of any one of claims 8 to 13, further comprising a compensation module to:
acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system;
acquiring positioning information of a vehicle, wherein the positioning information of the vehicle comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle;
establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system;
acquiring a course angle of the vehicle at the current moment according to the observation model of the vehicle;
compensating the course angle of the vehicle at the current moment according to the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to obtain a compensated course angle;
and updating the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system into the compensated course angle.
15. A calibration device for external parameters of a vehicle-mounted camera is characterized by comprising: a memory and a processor;
the memory is to store program instructions;
the processor is configured to invoke program instructions in the memory to perform the method of any of claims 1 to 7.
16. A chip comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the at least one processor being configured to execute a computer program or instructions to perform the method of any one of claims 1 to 7.
17. A computer-readable medium, characterized in that the computer-readable medium stores program code for computer execution, the program code comprising instructions for performing the method of any of claims 1 to 7.
CN202180006501.9A 2021-08-31 2021-08-31 Calibration method for external parameters of vehicle-mounted camera and related device Pending CN114730472A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/115802 WO2023028880A1 (en) 2021-08-31 2021-08-31 External parameter calibration method for vehicle-mounted camera and related apparatus

Publications (1)

Publication Number Publication Date
CN114730472A (en) 2022-07-08

Family

ID=82235994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180006501.9A Pending CN114730472A (en) 2021-08-31 2021-08-31 Calibration method for external parameters of vehicle-mounted camera and related device

Country Status (2)

Country Link
CN (1) CN114730472A (en)
WO (1) WO2023028880A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116109698A (en) * 2023-04-11 2023-05-12 禾多科技(北京)有限公司 Method, device and storage medium for determining coordinate value of target virtual parking space

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012508B (en) * 2023-03-28 2023-06-23 高德软件有限公司 Lane line rendering method, device and storage medium
CN116704040B (en) * 2023-04-03 2024-03-15 上海保隆汽车科技(武汉)有限公司 Camera calibration method, device, controller, vehicle and storage medium
CN116934847B (en) * 2023-09-15 2024-01-05 蓝思系统集成有限公司 Discharging method, discharging device, electronic equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651963B (en) * 2016-12-29 2019-04-26 清华大学苏州汽车研究院(吴江) A kind of installation parameter scaling method of the vehicle-mounted camera for driving assistance system
WO2019225681A1 (en) * 2018-05-23 2019-11-28 パナソニックIpマネジメント株式会社 Calibration device and calibration method
CN112184830B (en) * 2020-09-22 2021-07-09 深研人工智能技术(深圳)有限公司 Camera internal parameter and external parameter calibration method and device, computer equipment and storage medium


Also Published As

Publication number Publication date
WO2023028880A1 (en) 2023-03-09

Similar Documents

Publication Publication Date Title
CN110163930B (en) Lane line generation method, device, equipment, system and readable storage medium
CN114730472A (en) Calibration method for external parameters of vehicle-mounted camera and related device
CN109902637B (en) Lane line detection method, lane line detection device, computer device, and storage medium
CN110147382B (en) Lane line updating method, device, equipment, system and readable storage medium
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
EP2399239B1 (en) Estimation of panoramic camera orientation relative to a vehicle coordinate frame
CN110348297B (en) Detection method, system, terminal and storage medium for identifying stereo garage
CN109849930B (en) Method and device for calculating speed of adjacent vehicle of automatic driving automobile
CN110444044B (en) Vehicle pose detection system based on ultrasonic sensor, terminal and storage medium
CN110969145B (en) Remote sensing image matching optimization method and device, electronic equipment and storage medium
CN112257698B (en) Method, device, equipment and storage medium for processing annular view parking space detection result
CN112862890B (en) Road gradient prediction method, device and storage medium
CN114120149B (en) Oblique photogrammetry building feature point extraction method and device, electronic equipment and medium
CN111295667A (en) Image stereo matching method and driving assisting device
CN114897669A (en) Labeling method and device and electronic equipment
CN115164900A (en) Omnidirectional camera based visual aided navigation method and system in urban environment
CN112219225A (en) Positioning method, system and movable platform
CN114119749A (en) Monocular 3D vehicle detection method based on dense association
CN112132902B (en) Vehicle-mounted camera external parameter adjusting method and device, electronic equipment and medium
CN110827337B (en) Method and device for determining posture of vehicle-mounted camera and electronic equipment
CN116152347A (en) Vehicle-mounted camera mounting attitude angle calibration method and system
CN113034538B (en) Pose tracking method and device of visual inertial navigation equipment and visual inertial navigation equipment
CN113126117B (en) Method for determining absolute scale of SFM map and electronic equipment
CN109919998B (en) Satellite attitude determination method and device and terminal equipment
CN114076946A (en) Motion estimation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination