WO2023028880A1 - Method for calibrating external parameters of a vehicle-mounted camera and related device - Google Patents

Method for calibrating external parameters of a vehicle-mounted camera and related device

Info

Publication number
WO2023028880A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
vehicle
camera
parallel
camera coordinate
Prior art date
Application number
PCT/CN2021/115802
Other languages
English (en)
French (fr)
Inventor
何启盛
李涵
黄海晖
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN202180006501.9A priority Critical patent/CN114730472A/zh
Priority to PCT/CN2021/115802 priority patent/WO2023028880A1/zh
Publication of WO2023028880A1 publication Critical patent/WO2023028880A1/zh

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 - Lane; Road marking

Definitions

  • the present application relates to the technical field of camera calibration, and in particular to a method for calibrating external parameters of a vehicle-mounted camera and related devices.
  • Vehicle-mounted cameras play an increasingly important role as sensors in assisted driving and autonomous driving. By relating the environment around the vehicle to the digital images captured by the camera, they provide information necessary for safe driving.
  • the external parameters of the on-board camera play an important role.
  • the external parameters of the vehicle camera refer to the translation distance and rotation angle of the vehicle camera relative to the vehicle.
  • the external parameters of the vehicle camera are usually calibrated with the help of parallel lines.
  • the parallel lines used for the calibration of the external parameters of the vehicle camera can be the lane lines drawn in a specific site, or the lane lines on the road.
  • many existing technical solutions have specific constraints, for example, it is required that the distances between the three parallel lines used for extrinsic calibration of the vehicle camera must be equal.
  • the coordinates of the three parallel lines in the pixel coordinate system are obtained, converted into coordinates in the camera coordinate system, and the equal spacing between the three parallel lines is used as a constraint condition to obtain the external parameters of the vehicle camera.
  • using equidistant parallel lines to calibrate the vehicle camera makes the calibration method of the vehicle camera less flexible.
  • the present application provides a method for calibrating external parameters of a vehicle-mounted camera and a related device, which improves the flexibility of the method for calibrating external parameters of a vehicle-mounted camera.
  • the present application provides a method for calibrating external parameters of a vehicle-mounted camera, the method comprising: obtaining the coordinates, in the actual camera coordinate system of the vehicle-mounted camera, of M first parallel lines in a first image, where the first image is an image captured while the vehicle to which the vehicle-mounted camera belongs is driving on a target road.
  • the target road contains N parallel lane lines, the distance between any two of the N lane lines is known, and the M first parallel lines correspond one-to-one to M first lane lines among the N lane lines, where M is an integer greater than or equal to 3 and N is an integer greater than or equal to M; according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints on M second parallel lines in the bird's-eye view of the first image, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is determined.
  • the constraints on the parallel lines in the bird's-eye view include a parallel constraint, a vertical constraint, a spacing ratio constraint and a spacing constraint: the parallel constraint includes that the M second parallel lines are parallel; the vertical constraint includes that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system; the spacing ratio constraint includes that the ratio of the distances between any two of the M second parallel lines is the same as the ratio of the distances between the lane lines corresponding to those two second parallel lines; and the spacing constraint includes that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to those two second parallel lines is minimized.
  • according to the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera, and the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is determined.
  • in the technical solution provided by the present application, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the coordinates of the M first parallel lines in the first image in the actual camera coordinate system of the vehicle-mounted camera and the constraints on the M second parallel lines in the bird's-eye view of the first image, where M is an integer greater than or equal to 3, the M second parallel lines correspond one-to-one to the M first parallel lines, and the constraints on the M second parallel lines in the bird's-eye view include the parallel constraint, the vertical constraint, the spacing ratio constraint and the spacing constraint; then, according to the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is determined, where the first image is an image captured while the vehicle to which the vehicle-mounted camera belongs is driving on the target road, the target road contains N parallel lane lines, and N is an integer greater than or equal to M.
  • with the virtual ideal camera coordinate system of the vehicle-mounted camera serving as the transit coordinate system, it is only necessary to know the distance between any two adjacent lane lines among the N lane lines; the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera can be determined based on the coordinates of the M first parallel lines in the first image in the actual camera coordinate system of the vehicle-mounted camera and the parallel, vertical, spacing ratio and spacing constraints on the M second parallel lines in the bird's-eye view of the first image, without imposing other specific constraints on the N lane lines of the target road, which improves the flexibility of the external parameter calibration method.
  • determining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints on the M second parallel lines in the bird's-eye view includes: determining the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint; determining the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to those coordinates and the vertical constraint; determining the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to those coordinates and the spacing ratio constraint; and determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to those coordinates and the spacing constraint.
  • determining the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system in this way yields the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera, and improves the accuracy of this transformation relationship.
  • the parallel constraint further includes: minimizing $0.5\sum_{i=1}^{M-1}(\theta_i-\theta_{i+1})^2$, where θ_i is the included angle between the i-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the included angle between the (i+1)-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
  • minimizing $0.5\sum_{i=1}^{M-1}(\theta_i-\theta_{i+1})^2$ as the parallel constraint to determine the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that x-axis rotation angle.
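  • as a minimal illustrative sketch (not part of the patent text), the parallel constraint can be evaluated as a loss over the angles of the second parallel lines in the bird's-eye view; the function below assumes the angles are given in degrees and uses NumPy.

    import numpy as np

    def parallel_loss(theta_deg):
        """0.5 * sum_i (theta_i - theta_{i+1})^2 over consecutive second parallel lines."""
        theta = np.asarray(theta_deg, dtype=float)  # included angles to the x-axis, each <= 90 degrees
        return 0.5 * np.sum((theta[:-1] - theta[1:]) ** 2)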
  • the vertical constraint further includes: minimizing $0.5\sum_{j=1}^{M}(\theta_j-90)^2$, where θ_j is the included angle between the j-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
  • minimizing $0.5\sum_{j=1}^{M}(\theta_j-90)^2$ as the vertical constraint to determine the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that y-axis rotation angle.
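  • similarly, a minimal sketch of the vertical constraint as a loss (illustrative only; angles in degrees):

    import numpy as np

    def vertical_loss(theta_deg):
        """0.5 * sum_j (theta_j - 90)^2: penalizes deviation of each second parallel line
        from being perpendicular to the x-axis of the virtual ideal camera coordinate system."""
        theta = np.asarray(theta_deg, dtype=float)
        return 0.5 * np.sum((theta - 90.0) ** 2)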
  • the spacing ratio constraint further includes: minimizing a loss of the form $0.5\sum_{s=1}^{M-2}\left(\frac{\Delta_{s,s+1}}{\Delta_{s+1,s+2}}-\frac{W_{s,s+1}}{W_{s+1,s+2}}\right)^2$, where Δ_{s,s+1} is the distance between the s-th second parallel line and the (s+1)-th second parallel line among the M second parallel lines, Δ_{s+1,s+2} is the distance between the (s+1)-th second parallel line and the (s+2)-th second parallel line among the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M-1.
  • minimizing this spacing ratio loss as the constraint to determine the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that z-axis rotation angle.
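  • a sketch of the spacing ratio constraint as a loss, assuming the reconstructed ratio form above; delta holds the spacings between adjacent second parallel lines and w the known lane-line spacings (both names are illustrative assumptions):

    import numpy as np

    def spacing_ratio_loss(delta, w):
        """0.5 * sum_s (delta_{s,s+1}/delta_{s+1,s+2} - W_{s,s+1}/W_{s+1,s+2})^2."""
        delta = np.asarray(delta, dtype=float)  # M-1 spacings measured in the bird's-eye view
        w = np.asarray(w, dtype=float)          # M-1 known lane-line spacings
        return 0.5 * np.sum((delta[:-1] / delta[1:] - w[:-1] / w[1:]) ** 2)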
  • the spacing constraint further includes: minimizing $0.5\sum_{i=1}^{M-1}(\Delta_{i,i+1}-W_{i,i+1})^2$, where Δ_{i,i+1} is the distance between the i-th second parallel line and the (i+1)-th second parallel line among the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
  • minimizing $0.5\sum_{i=1}^{M-1}(\Delta_{i,i+1}-W_{i,i+1})^2$ as the spacing constraint to determine the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system improves the accuracy of that z-axis translation distance.
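  • a sketch of the spacing constraint as a loss (illustrative only), comparing each measured spacing with the corresponding known lane-line spacing:

    import numpy as np

    def spacing_loss(delta, w):
        """0.5 * sum_i (delta_{i,i+1} - W_{i,i+1})^2."""
        delta = np.asarray(delta, dtype=float)
        w = np.asarray(w, dtype=float)
        return 0.5 * np.sum((delta - w) ** 2)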
  • the method further includes: acquiring the y-axis translation of the vehicle-mounted camera in the world coordinate system; acquiring positioning information of the vehicle, where the positioning information includes the current position information of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle; establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation of the vehicle-mounted camera in the world coordinate system; obtaining the heading angle of the vehicle at the current moment according to the observation model of the vehicle; compensating, according to the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, the heading angle of the vehicle at the current moment to obtain a compensated heading angle; and updating the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to the compensated heading angle.
  • in this way, the heading angle of the vehicle at the current moment is obtained according to the vehicle observation model established from the vehicle's positioning information and the y-axis translation of the vehicle-mounted camera in the world coordinate system, the heading angle in the external parameters of the vehicle-mounted camera in the world coordinate system and the heading angle of the vehicle at the current moment are used to obtain the compensated heading angle, and the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to the compensated heading angle, so that the influence of the heading angle of the vehicle can be compensated and the accuracy of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is improved.
  • the present application provides a device for calibrating external parameters of a vehicle-mounted camera.
  • the device may include various modules for implementing the method in the first aspect, and these modules may be implemented by means of software and/or hardware.
  • the present application provides a device for calibrating external parameters of a vehicle-mounted camera.
  • the apparatus can include a processor coupled with a memory.
  • the memory is used to store program codes
  • the processor is used to execute the program codes in the memory, so as to implement the method in the first aspect or any one of the implementation manners.
  • the device may also include the memory.
  • the present application provides a chip, including at least one processor and a communication interface, the communication interface and the at least one processor are interconnected through lines, and the at least one processor is used to run computer programs or instructions to execute The method described in the first aspect or any one of the possible implementation manners.
  • the present application provides a computer-readable medium, where the computer-readable medium stores program code for execution by a device, and the program code includes program code for executing the method described in the first aspect or any one of the possible implementation manners.
  • the present application provides a computer program product containing instructions.
  • when the computer program product is run on a computer, it causes the computer to execute the method described in the first aspect or any one of the possible implementation manners.
  • the present application provides a computing device, including at least one processor and a communication interface, the communication interface and the at least one processor are interconnected through lines, the communication interface communicates with a target system, and the at least one processor is used to run computer programs or instructions, so as to execute the method described in the first aspect or any one of the possible implementation manners.
  • the present application provides a computing system, including at least one processor and a communication interface, the communication interface and the at least one processor are interconnected through lines, the communication interface communicates with a target system, and the at least one processor is used to run computer programs or instructions, so as to execute the method described in the first aspect or any one of the possible implementation manners.
  • Fig. 1 is a schematic diagram of a pixel coordinate system
  • Fig. 2 is a schematic diagram of a camera coordinate system
  • FIG. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a world coordinate system according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an application scenario of an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of a method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of a first image according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a bird's-eye view of an embodiment of the present application.
  • FIG. 9 is a schematic diagram of heading angle compensation according to an embodiment of the present application.
  • FIG. 10 is a schematic flowchart of another method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a calibration device for external parameters of a vehicle-mounted camera according to an embodiment of the present application.
  • Fig. 12 is a schematic structural diagram of a device for calibrating external parameters of a vehicle-mounted camera according to another embodiment of the present application.
  • FIG. 1 is a schematic diagram of a pixel coordinate system. As shown in FIG. 1, the vertex at the upper left corner of the pixel coordinate system is the origin O_p, the u axis points horizontally to the right, and the v axis points vertically downward.
  • Pixel coordinates refer to the position of a pixel in an image.
  • the coordinates of any pixel point can be expressed as (u_i, v_i).
  • the pixel representation does not reflect the physical size of the objects in the image.
  • FIG. 2 is a schematic diagram of a camera coordinate system.
  • the camera coordinate system takes the optical axis of the camera as the Z_c axis, and the optical center of the camera's optical system, which in practice is the center of the lens, as the origin O_c.
  • the horizontal axis X_c and the vertical axis Y_c of the camera coordinate system are parallel to the u axis and the v axis of the pixel coordinate system, respectively.
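  • purely for illustration, a pixel coordinate (u, v) can be back-projected into the camera coordinate system once the camera intrinsics and the depth along the optical axis are known; the intrinsic values below are placeholders, not values from this application.

    import numpy as np

    # Placeholder intrinsic matrix: fx, fy are focal lengths in pixels, (cx, cy) the principal point.
    K = np.array([[1000.0,    0.0, 960.0],
                  [   0.0, 1000.0, 540.0],
                  [   0.0,    0.0,   1.0]])

    def pixel_to_camera(u, v, z_c, K=K):
        """Back-project pixel (u, v) at depth z_c along the optical axis to camera coordinates."""
        x_n, y_n, _ = np.linalg.inv(K) @ np.array([u, v, 1.0])  # normalized image coordinates
        return np.array([x_n * z_c, y_n * z_c, z_c])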
  • FIG. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application.
  • L1, L2, and L3 are three parallel lines located on the road where the vehicle is driving.
  • the center of the camera lens is taken as the origin O_r of the actual camera coordinate system;
  • the optical axis of the camera is selected as Z_r of the actual camera coordinate system, Z_r is parallel to the road on which the vehicle is driving, and the direction in front of the camera is the positive direction of Z_r;
  • the direction perpendicular to Z_r and parallel to the road surface on which the vehicle is driving is X_r of the actual camera coordinate system, the direction from L2 to L3 is selected as the positive direction of X_r, and the direction perpendicular to the road surface is selected as Y_r (not shown in the figure),
  • the positive direction of Y_r is chosen to be the direction perpendicular to the road surface on which the vehicle is driving and pointing into the road surface.
  • the origin of the virtual ideal camera coordinate system coincides with the origin O_r of the actual camera coordinate system, and the y-axis of the virtual ideal camera coordinate system coincides with Y_r, with the same direction.
  • FIG. 4 is a schematic diagram of a world coordinate system according to an embodiment of the present application.
  • L1, L2, and L3 are three parallel lines on the road where the car is driving
  • ZOX is the virtual ideal camera coordinate system of the camera
  • any point on L2 is selected as the origin O w of the world coordinate system
  • the direction of X_w of the world coordinate system is consistent with the z-axis direction of the virtual ideal camera coordinate system of the camera.
  • Y_w of the world coordinate system is perpendicular to L2 and parallel to the road surface on which the car is driving.
  • the direction from L2 to L1 is selected as the positive direction of Y_w.
  • FIG. 5 is a schematic diagram of an application scenario of an embodiment of the present application.
  • the scene shown in FIG. 5 is a scene in which the external parameters of the camera are calibrated by using three parallel lines L1 , L2 and L3 with known distances on the driving road. By driving the vehicle from one end of the three parallel lines to the other end, the calibration of the external parameters of the camera is completed.
  • the three parallel lines L1 , L2 and L3 with known spacing on the driving road may be lane lines drawn on a specific site, or lane lines on normal roads.
  • the scene shown in FIG. 5 is only an example, and the technical solution of the present application can also be applied to other scenes, as long as the scene involves the calibration of external parameters of the camera.
  • the technical solution of the present application can also be applied to scenarios such as calibration of external parameters of cameras in intelligent robots.
  • FIG. 6 is a schematic flowchart of a method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application. As shown in FIG. 6, the method at least includes S601 to S603.
  • the first image is an image captured by the vehicle to which the vehicle-mounted camera belongs while driving on the target road.
  • the target road contains N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines correspond one-to-one to M first lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
  • the vehicle-mounted camera captures M lane lines among the N lane lines on the target road to obtain the first image.
  • the first image contains M first parallel lines, and the M first parallel lines are obtained by the vehicle-mounted camera imaging M lane lines among the N lane lines on the target road.
  • the M first parallel lines in the first image The parallel lines are in one-to-one correspondence with the M lane lines among the N lane lines on the target road.
  • FIG. 7 is a schematic diagram of a first image according to an embodiment of the present application.
  • the first image contains first parallel lines L1, L2 and L3, which correspond respectively to 3 lane lines among the N lane lines in the target road.
  • M is equal to 3
  • N is greater than or equal to 3.
  • the M first parallel lines are extracted from the first image, the coordinates of the M first parallel lines in the pixel coordinate system are obtained, and the coordinates of the M first parallel lines in the pixel coordinate system are converted into coordinates in the camera coordinate system of the vehicle-mounted camera.
  • an example of the pixel coordinate system may be the pixel coordinate system shown in FIG. 1, and an example of the camera coordinate system may be the actual camera coordinate system shown in FIG. 3.
  • a segmentation algorithm is used to extract parallel line regions and contours in the first image, and then sub-pixel level edge extraction is performed to obtain M first parallel lines.
  • segmentation algorithms include watershed algorithms and the like.
  • Hough transform is used to extract straight lines in the first image, and methods such as clustering and filtering are used to obtain M first parallel lines in the first image.
  • the Hough transform is a feature extraction technique in image processing that can detect objects with specific shapes through a voting algorithm; the clustering method groups straight line segments with similar slopes and intercepts; the filtering method sets a region of interest for screening with reference to the installation angle and position of the camera.
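  • a minimal sketch of this second approach, assuming OpenCV is available; the edge thresholds and the slope/intercept grouping tolerances are illustrative assumptions, not parameters specified by this application.

    import cv2
    import numpy as np

    def extract_candidate_lines(image_bgr, slope_tol=0.05, intercept_tol=20.0):
        """Detect straight segments with the Hough transform and group them by similar slope/intercept."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                                   minLineLength=60, maxLineGap=20)
        clusters = []  # each cluster: list of (slope, intercept) of nearly collinear segments
        if segments is None:
            return clusters
        for x1, y1, x2, y2 in segments[:, 0]:
            if x2 == x1:
                continue  # skip exactly vertical segments in this simple sketch
            slope = (y2 - y1) / (x2 - x1)
            intercept = y1 - slope * x1
            for cluster in clusters:
                s0, b0 = cluster[0]
                if abs(slope - s0) < slope_tol and abs(intercept - b0) < intercept_tol:
                    cluster.append((slope, intercept))
                    break
            else:
                clusters.append([(slope, intercept)])
        return clusters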
  • the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera consists of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
  • An example of the actual camera coordinate system of the vehicle camera may be the actual camera coordinate system shown in FIG. 3
  • an example of the virtual ideal camera coordinate system of the vehicle camera may be the virtual ideal camera coordinate system shown in FIG. 3 .
  • an inverse perspective transformation method is used to obtain the bird's-eye view of the M first parallel lines.
  • the bird's-eye view contains M second parallel lines, and the M second parallel lines correspond one-to-one to the M first parallel lines.
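  • a simplified sketch of one possible inverse perspective mapping: because the virtual ideal camera shares its origin with the actual camera, pixels of the actual camera can be mapped to pixels of the virtual camera through the rotation-induced homography K·R·K⁻¹; K and R here are placeholders assumed to be known, and the full image could be warped with a routine such as cv2.warpPerspective.

    import numpy as np

    def rotation_homography(K, R):
        """Homography mapping pixels of the actual camera to pixels of a virtual camera that
        shares the same optical center but is rotated by R (actual -> virtual)."""
        return K @ R @ np.linalg.inv(K)

    def warp_points(H, pts_uv):
        """Apply homography H to an (N, 2) array of pixel coordinates."""
        pts = np.hstack([pts_uv, np.ones((len(pts_uv), 1))])
        mapped = (H @ pts.T).T
        return mapped[:, :2] / mapped[:, 2:3]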
  • FIG. 8 is a schematic diagram of a bird's-eye view of an embodiment of the present application.
  • in FIG. 8, M is equal to 3, and the second parallel lines are L1, L2 and L3, where θ_1 is the included angle between the second parallel line L1 and the x-axis direction of the virtual ideal camera coordinate system, θ_2 is the included angle between the second parallel line L2 and the x-axis direction of the virtual ideal camera coordinate system, θ_3 is the included angle between the second parallel line L3 and the x-axis direction of the virtual ideal camera coordinate system, Δ_{12} is the distance between the second parallel line L1 and the second parallel line L2, and Δ_{23} is the distance between the second parallel line L2 and the second parallel line L3.
  • the parallel constraint includes that the M second parallel lines are parallel;
  • the vertical constraint includes that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system;
  • the spacing ratio constraint includes that the ratio of the distances between any two of the M second parallel lines is the same as the ratio of the distances between the lane lines corresponding to those two second parallel lines;
  • the spacing constraint includes that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to those two second parallel lines is minimized.
  • the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system are optimized in this order.
  • first, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined through optimization or similar methods.
  • the parallel constraint includes minimizing the loss function $0.5\sum_{i=1}^{M-1}(\theta_i-\theta_{i+1})^2$, where θ_i is the included angle between the i-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the included angle between the (i+1)-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
  • the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by means of optimization or the like.
  • the vertical constraint includes minimizing the loss function $0.5\sum_{j=1}^{M}(\theta_j-90)^2$, where θ_j is the included angle between the j-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
  • the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization and other methods.
  • the spacing ratio constraint includes minimizing a loss function of the form $0.5\sum_{s=1}^{M-2}\left(\frac{\Delta_{s,s+1}}{\Delta_{s+1,s+2}}-\frac{W_{s,s+1}}{W_{s+1,s+2}}\right)^2$, where Δ_{s,s+1} is the distance between the s-th second parallel line and the (s+1)-th second parallel line among the M second parallel lines, Δ_{s+1,s+2} is the distance between the (s+1)-th second parallel line and the (s+2)-th second parallel line among the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M-1.
  • the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization and other methods.
  • the spacing constraint includes minimizing $0.5\sum_{i=1}^{M-1}(\Delta_{i,i+1}-W_{i,i+1})^2$, where Δ_{i,i+1} is the distance between the i-th second parallel line and the (i+1)-th second parallel line among the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
  • the constraint on the M second parallel lines in the bird's-eye view includes minimizing the loss function $\alpha\sum_{n=1}^{M}(\theta_n-90)^2+\beta\sum_{m=1}^{M-1}(\Delta_{m,m+1}-W_{m,m+1})^2$, where α and β are weight parameters, θ_n is the included angle between the n-th second parallel line among the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, Δ_{m,m+1} is the distance between the m-th second parallel line and the (m+1)-th second parallel line among the M second parallel lines, W_{m,m+1} is the actual distance between the lane line corresponding to the m-th second parallel line and the lane line corresponding to the (m+1)-th second parallel line, n is an integer greater than or equal to 1 and less than or equal to M, and m is an integer greater than or equal to 1 and less than M.
  • the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system can also be obtained simultaneously by such a combined optimization, but an improper choice of the initial value easily yields only a local optimum, rather than the global optimum, of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera.
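  • an illustrative sketch of the combined formulation above, assuming SciPy is available and that a user-supplied function birdseye_measurements(params) returns the angles θ and spacings Δ of the M second parallel lines for a candidate parameter vector (x, y, z rotation angles and z translation); the weights alpha and beta, the initial guess and all names are assumptions, not values from this application.

    import numpy as np
    from scipy.optimize import minimize

    def make_joint_loss(birdseye_measurements, w_known, alpha=1.0, beta=1.0):
        """Joint loss alpha*sum((theta_n - 90)^2) + beta*sum((delta_m - W_m)^2)."""
        def loss(params):
            theta_deg, delta = birdseye_measurements(params)  # angles (deg) and spacings for params
            theta_deg = np.asarray(theta_deg, dtype=float)
            delta = np.asarray(delta, dtype=float)
            return (alpha * np.sum((theta_deg - 90.0) ** 2)
                    + beta * np.sum((delta - np.asarray(w_known, dtype=float)) ** 2))
        return loss

    # Example usage (placeholders): params = [rot_x, rot_y, rot_z, trans_z]
    # result = minimize(make_joint_loss(birdseye_measurements, w_known), x0=np.zeros(4),
    #                   method="Nelder-Mead")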
  • the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system is the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle camera.
  • the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system constitutes the external parameters of the vehicle-mounted camera, including the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, which can also be called the roll angle (roll), pitch angle (pitch), yaw angle (yaw) and translation distance.
  • an example of the actual camera coordinate system of the vehicle-mounted camera can be the actual camera coordinate system shown in FIG. 3, an example of the virtual ideal camera coordinate system of the vehicle-mounted camera can be the virtual ideal camera coordinate system shown in FIG. 3 or FIG. 4, and an example of the world coordinate system can be the world coordinate system shown in FIG. 4.
  • through matrix transformation, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is combined with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system to obtain the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, that is, the external parameters of the vehicle-mounted camera.
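  • a minimal sketch of this matrix composition, using 4x4 homogeneous transforms; T_ideal_from_actual and T_world_from_ideal are placeholders for the two transformation relationships named above.

    import numpy as np

    def compose(T_world_from_ideal, T_ideal_from_actual):
        """Chain the two transforms to obtain the actual-camera-to-world transform (the extrinsics)."""
        return T_world_from_ideal @ T_ideal_from_actual

    # Example with identity placeholders (4x4 homogeneous matrices):
    T_ideal_from_actual = np.eye(4)
    T_world_from_ideal = np.eye(4)
    T_world_from_actual = compose(T_world_from_ideal, T_ideal_from_actual)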
  • the sensor of the vehicle to which the vehicle-mounted camera belongs includes a wheel speedometer, an inertial sensor (inertial measurement unit, IMU) and a global positioning system (global positioning system, GPS), etc.
  • correspondingly, the positioning information of the vehicle includes the current position information of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle, etc.
  • FIG. 9 is a schematic diagram of heading angle compensation according to an embodiment of the present application.
  • in FIG. 9, the vehicle heading angle is the heading angle of the vehicle at the current moment obtained according to the observation model of the vehicle, and the camera heading angle is the heading angle in the determined external parameters of the vehicle-mounted camera; the vehicle heading angle is subtracted from the camera heading angle to obtain the heading angle from the camera to the vehicle, which can also be called the compensated heading angle; the heading angle in the determined external parameters of the vehicle-mounted camera is then updated to the compensated heading angle, so that the influence of the vehicle heading angle obtained from vehicle positioning is compensated.
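  • as a sketch of the compensation described above (the function and variable names are illustrative), the camera-to-vehicle heading angle is obtained by differencing the two heading angles and wrapping the result:

    def compensate_heading(camera_heading_deg, vehicle_heading_deg):
        """Subtract the vehicle heading from the camera heading and wrap to [-180, 180) degrees."""
        diff = camera_heading_deg - vehicle_heading_deg
        return (diff + 180.0) % 360.0 - 180.0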
  • multiple sets of external parameters of the vehicle-mounted camera obtained after heading angle compensation are cached and filtered to obtain filtered external parameters of the vehicle-mounted camera, and the external parameters of the vehicle-mounted camera are updated to the filtered external parameters.
  • multiple sets of external parameters of the vehicle-mounted camera after heading angle compensation are cached, and periodic optimal estimation is performed; if, when the vehicle drives out of the calibration site, the number of cached sets of compensated external parameters does not exceed a preset threshold, all cached sets are used for the optimal estimation and the calibration is completed; if the number of cached sets exceeds the preset threshold when the vehicle drives out of the calibration site, the cached sets are selected at a certain interval (for example, every 10 frames) for the optimal estimation.
  • the calibration site is the area where N lane lines in the target road for extrinsic calibration of the vehicle camera are located.
  • a set of data is cached separately for each type of external parameter in the compensated external parameters of the vehicle-mounted camera, that is, a set of data is cached for the roll angle, the pitch angle, the heading angle and the translation distance respectively, and the kernel density estimation method is used to obtain the external parameter value with the highest probability in each set of data, which is taken as the optimal estimate of that type of external parameter.
  • the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, which can resist a certain amount of noise interference.
  • the fitted kernel function (preferably a Gaussian kernel function) is evaluated, the peak of the kernel function is found, and the corresponding abscissa value (the value with the highest probability under the Gaussian function) is taken as the optimal value of that type of external parameter.
  • the optimal value of each type of the four types of external parameters can be estimated simultaneously by using the four-dimensional kernel density estimation method.
  • a set of data is separately cached for each type of external parameter, and a median filtering method is used to obtain the median of each type of external parameter, which is the optimal value of each type of external parameter.
  • a set of data is buffered for each type of external parameter, and the average value of each set of data is calculated by using the mean filtering method, and the obtained average value of each set of data is the optimal value of each type of external parameter.
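  • a sketch of the per-parameter kernel density estimate, assuming SciPy is available; the cached sample array below is an illustrative placeholder.

    import numpy as np
    from scipy.stats import gaussian_kde

    def kde_mode(samples, num_grid=512):
        """Return the value with the highest estimated probability density among the cached samples."""
        samples = np.asarray(samples, dtype=float)
        kde = gaussian_kde(samples)                      # Gaussian kernel density estimate
        grid = np.linspace(samples.min(), samples.max(), num_grid)
        return grid[np.argmax(kde(grid))]

    # Example: optimal roll-angle estimate from cached, heading-compensated extrinsics.
    # roll_opt = kde_mode(cached_roll_angles)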
  • when the calibration result converges, the calibration is stopped, and the external parameters of the vehicle-mounted camera are obtained.
  • FIG. 10 is a schematic flowchart of another method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application. As shown in Fig. 10, the method at least includes S1001 to S1011.
  • the first image is an image captured by the vehicle to which the vehicle-mounted camera belongs while driving on the target road.
  • the target road contains N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines correspond one-to-one to M first lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
  • the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system is the x-axis rotation angle, y-axis rotation angle, z-axis rotation angle and z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle camera.
  • the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system constitutes the external parameters of the vehicle-mounted camera, including the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, which can also be called the roll angle (roll), pitch angle (pitch), yaw angle (yaw) and translation distance.
  • an example of the actual camera coordinate system of the vehicle-mounted camera can be the actual camera coordinate system shown in FIG. 3, an example of the virtual ideal camera coordinate system of the vehicle-mounted camera can be the virtual ideal camera coordinate system shown in FIG. 3 or FIG. 4, and an example of the world coordinate system can be the world coordinate system shown in FIG. 4.
  • through matrix transformation, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is combined with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system to obtain the first transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, that is, the first external parameters of the vehicle-mounted camera.
  • the position of the vehicle camera on the Y w axis of the world coordinate system is the y-axis translation of the vehicle camera in the world coordinate system.
  • the positioning information of the vehicle is obtained from the sensors of the vehicle to which the vehicle-mounted camera belongs, where the sensors include a wheel speedometer, an IMU and a GPS, etc.; correspondingly, the positioning information of the vehicle includes the current position information of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle, etc.
  • the observation model of the vehicle is filtered and updated, and the heading angle of the vehicle at the current moment is obtained.
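  • purely as an illustration of one possible observation model (not one specified by this application), the vehicle heading can be estimated from consecutive positions and smoothed with a simple filter update; all names and the smoothing gain are assumptions.

    import numpy as np

    def heading_from_positions(p_prev, p_curr):
        """Heading angle (degrees) of the displacement from the previous to the current position."""
        dx, dy = np.subtract(p_curr, p_prev)
        return np.degrees(np.arctan2(dy, dx))

    def filtered_heading(prev_estimate_deg, measurement_deg, gain=0.3):
        """Blend the previous heading estimate with the new measurement (simple filter update)."""
        innovation = (measurement_deg - prev_estimate_deg + 180.0) % 360.0 - 180.0
        return prev_estimate_deg + gain * innovation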
  • the vehicle heading angle is the heading angle of the vehicle at the current moment obtained according to the observation model of the vehicle, and the camera heading angle is the heading angle in the determined first external parameters of the vehicle-mounted camera; the vehicle heading angle is subtracted from the camera heading angle to obtain the heading angle from the camera to the vehicle, which can also be called the compensated heading angle, and the determined heading angle of the vehicle-mounted camera is updated to the compensated heading angle, which compensates for the influence of the vehicle heading angle obtained from vehicle positioning.
  • the first external parameters of the vehicle-mounted camera include a first roll angle, a first pitch angle, a first heading angle, and a first translation distance. Update the first heading angle in the first external parameters of the vehicle-mounted camera to the compensated heading angle, and the updated first external parameters of the vehicle-mounted camera include the first roll angle, the first pitch angle, the compensated heading angle and the first translation distance.
  • the vehicle-mounted camera captures multiple first images, multiple first external parameters of the vehicle-mounted camera are obtained from the multiple first images, and the multiple first external parameters of the vehicle-mounted camera are cached.
  • the calibration site is the area where the N lane lines in the target road for extrinsic calibration of the vehicle camera are located.
  • periodic optimal estimation is performed on multiple first external parameters of the cached vehicle-mounted camera.
  • if the number of cached first external parameters of the vehicle-mounted camera does not exceed a preset threshold during the period from when the vehicle to which the vehicle-mounted camera belongs drives into the calibration site to when it drives out, all cached first external parameters are used for the optimal estimation and the calibration is completed; if the number of cached first external parameters exceeds the preset threshold during that period, the cached first external parameters are selected at a certain interval (for example, every 10 frames) for the optimal estimation.
  • a set of data is cached separately for each type of external parameter among the multiple first external parameters of the vehicle-mounted camera, that is, a set of data is cached for the roll angle, the pitch angle, the heading angle and the translation distance respectively, and the kernel density estimation method is used to obtain the external parameter value with the highest probability in each set of data, which is taken as the optimal estimate of that type of external parameter.
  • the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, which can resist a certain amount of noise interference.
  • the fitted kernel function (preferably a Gaussian kernel function) is evaluated, the peak of the kernel function is found, and the corresponding abscissa value (the value with the highest probability under the Gaussian function) is taken as the optimal value of that type of external parameter.
  • the optimal value of each type of the four types of external parameters can be estimated simultaneously by using the four-dimensional kernel density estimation method.
  • a set of data is separately cached for each type of external parameter, and a median filtering method is used to obtain the median of each type of external parameter, which is the optimal value of each type of external parameter.
  • a set of data is buffered for each type of external parameter, and the average value of each set of data is calculated by using the mean filtering method, and the obtained average value of each set of data is the optimal value of each type of external parameter.
  • when the calibration result converges, the calibration is stopped, and the second external parameters of the vehicle-mounted camera are obtained.
  • the heading angle in the external parameters of the vehicle-mounted camera in the world coordinate system is used to compensate the heading angle of the vehicle at the current moment, the compensated heading angle is obtained, and the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to the compensated heading angle, so that when the vehicle is not driving completely parallel to the lane lines, the influence of the vehicle heading angle can be compensated and the accuracy of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is improved.
  • FIG. 11 is a schematic structural diagram of a device for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application.
  • an apparatus 1100 may include an acquisition module 1101 and a processing module 1102 .
  • the apparatus 1100 may be used to implement the method shown in any one of the foregoing embodiments.
  • the apparatus 1100 may be used to implement the method shown in FIG. 6 above.
  • the acquisition module 1101 is used to implement S601
  • the processing module 1102 is used to implement S602 and S603.
  • the apparatus 1100 further includes a compensation module, an update module, and a cache module.
  • the apparatus 1100 in this implementation manner may be used to implement the method shown in FIG. 10 above.
  • the acquisition module 1101 is used to realize S1001, S1004 and S1006, the processing module 1102 is used to realize S1002, S1003, S1005 and S1010, the compensation module is used to realize S1007, the update module is used to realize S1008 and S1011, and the cache module is used to realize S1009 .
  • Fig. 12 is a schematic structural diagram of a device for calibrating external parameters of a vehicle-mounted camera according to another embodiment of the present application.
  • the device 1200 shown in FIG. 12 can be used to execute the calibration method of the external parameters of the vehicle-mounted camera shown in any one of the above-mentioned embodiments.
  • the apparatus 1200 of this embodiment includes: a memory 1201 , a processor 1202 , a communication interface 1203 and a bus 1204 .
  • the memory 1201 , the processor 1202 , and the communication interface 1203 are connected to each other through a bus 1204 .
  • the memory 1201 may be a read only memory (read only memory, ROM), a static storage device, a dynamic storage device or a random access memory (random access memory, RAM).
  • the memory 1201 may store programs, and when the programs stored in the memory 1201 are executed by the processor 1202, the processor 1202 may be used to execute various steps of the methods shown in FIG. 6 and FIG. 10 .
  • the processor 1202 may adopt a general-purpose central processing unit (central processing unit, CPU), a microprocessor, an application specific integrated circuit (application specific integrated circuit, ASIC), or one or more integrated circuits, for executing related programs to realize the method for calibrating the external parameters of the vehicle-mounted camera in the method embodiments of the present application.
  • the processor 1202 may also be an integrated circuit chip with signal processing capabilities.
  • each step of the method in each embodiment of the present application may be implemented by an integrated logic circuit of hardware in the processor 1202 or instructions in the form of software.
  • the above-mentioned processor 1202 can also be a general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, or discrete hardware components.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a mature storage medium in the field such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, register.
  • the storage medium is located in the memory 1201; the processor 1202 reads the information in the memory 1201 and, in combination with its hardware, completes the functions required by the methods in the embodiments of the present application, for example, executes the steps/functions of the embodiments shown in FIG. 6 and FIG. 10.
  • the communication interface 1203 may use, but is not limited to, a transceiver device such as a transceiver to implement communication between the device 1200 and other devices or communication networks.
  • the bus 1204 may include a pathway for transferring information between various components of the device 1200 (eg, memory 1201 , processor 1202 , communication interface 1203 ).
  • the apparatus 1200 shown in the embodiment of the present application may be an electronic device, or may also be a chip configured in the electronic device.
  • the processor in the embodiment of the present application may be a central processing unit (central processing unit, CPU), and the processor may also be other general-purpose processors, digital signal processors (digital signal processor, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), off-the-shelf programmable gate array (field programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories.
  • the non-volatile memory can be read-only memory (read-only memory, ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically programmable Erases programmable read-only memory (electrically EPROM, EEPROM) or flash memory.
  • Volatile memory can be random access memory (RAM), which acts as external cache memory.
  • by way of example rather than limitation, many forms of RAM are available, such as static random access memory (static RAM, SRAM), dynamic random access memory (dynamic RAM, DRAM), synchronous dynamic random access memory (synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (double data rate SDRAM, DDR SDRAM), enhanced synchronous dynamic random access memory (enhanced SDRAM, ESDRAM), synchlink dynamic random access memory (synchlink DRAM, SLDRAM) and direct rambus random access memory (direct rambus RAM, DR RAM).
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware or other arbitrary combinations.
  • the above-described embodiments may be implemented in whole or in part in the form of computer program products.
  • the computer program product comprises one or more computer instructions or computer programs.
  • when the computer instructions or computer programs are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wired or wireless (such as infrared, radio or microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that includes one or more sets of available media.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media.
  • the semiconductor medium may be a solid state drive.
  • "At least one" means one or more, and "multiple" means two or more.
  • "At least one of the following" or similar expressions refers to any combination of these items, including any combination of single items or plural items.
  • for example, at least one item (piece) of a, b, or c can represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c can each be single or multiple.
  • the sequence numbers of the above-mentioned processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • if the functions described above are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Disclosed in the present application, in the technical field of camera calibration, are a method for calibrating the external parameters of a vehicle-mounted camera and a related apparatus. In the technical solution provided by the present application, a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the coordinates, in the actual camera coordinate system, of M first parallel lines in an acquired first image and constraints on M second parallel lines in a bird's-eye view of the first image, where M is an integer greater than or equal to 3 and the M second parallel lines correspond one-to-one to the M first parallel lines; a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a world coordinate system is then determined according to the transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system and a transformation relationship between the virtual ideal camera coordinate system and the world coordinate system. This improves the flexibility of the method for calibrating the external parameters of the vehicle-mounted camera.

Description

车载相机的外部参数的标定方法及相关装置 技术领域
本申请涉及相机标定技术领域,尤其涉及一种车载相机的外部参数的标定方法及相关装置。
背景技术
车载相机作为一种传感器在辅助驾驶和自动驾驶中所起的作用日益重要,通过将车辆周围的环境与车载相机拍摄到的数字图像相关联,可以为安全驾驶提供必要信息。在将车辆周围的环境与车载相机拍摄到的数字图像相关联的过程中,车载相机的外部参数起到了重要作用。其中,车载相机的外部参数是指车载相机相对于车辆的平移距离和旋转角度。
目前,通常借助平行线对车载相机的外部参数进行标定,用于车载相机外参标定的平行线可以是在特定场地中画出的车道线,也可以是道路上的车道线。在标定过程中,很多现有技术方案都设置有特定的约束条件,比如,要求用于车载相机外参标定的三条平行线之间的距离要相等。具体地,获取三条平行线的像素坐标系坐标,将三条平行线的像素坐标系坐标转换为相机坐标系坐标,并将三条平行线间的距离相等作为约束条件,得到车载相机的外部参数。但是,使用等距的平行线进行车载相机的标定使得车载相机的标定方法的灵活性较低。
因此,如何提高车载相机的外参标定方法的灵活性成为了亟待解决的问题。
发明内容
本申请提供了一种车载相机的外部参数的标定方法及相关装置,提高了车载相机的外参标定方法的灵活性。
第一方面,本申请提供一种车载相机的外部参数的标定方法,所述方法包括:获取第一图像中的M条第一平行线在所述车载相机的实际相机坐标系中的坐标,所述第一图像为所述车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,所述目标道路上包含平行的N条车道线,所述N条车道线中的任意两条车道线之间的距离是已知的,所述M条第一平行线与所述N条车道线中的M条第一车道线一一对应,M为大于或等于3的整数,N为大于或等于M的整数;根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述第一图像的鸟瞰图中M条第二平行线的约束,确定所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系,所述M条第二平行线与所述M条第一平行线一一对应,所述M条第二平行线在所述鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约束,所述平行约束包括所述M条第二平行线平行,所述竖直约束包括所述M条第二平行线中的任意一条第二平行线与所述虚拟理想相机坐标系的x轴方向垂直,所述间距比例约束包括所述M条第二平行线中的任意两个第二平行线之间的距离比例与所述 任意两个第二平行线对应的M条车道线之间的距离比例相同,所述距离约束包括所述任意两个第二平行线之间的距离与所述任意两个第二平行线对应的M条车道线之间的距离的差值最小;根据所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系以及所述虚拟理想相机坐标系与世界坐标系之间的变换关系,确定所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系。
本方法中,先根据获取的第一图像中的M条第一平行线在车载相机的实际相机坐标系中的坐标和第一图像的鸟瞰图中M条第二平行线的约束,确定车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系,M为大于或等于3的整数,M条第二平行线与M条第一平行线一一对应,M条第二平行线在鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约束;再根据车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系以及虚拟理想相机坐标系与世界坐标系之间的变换关系,确定车载相机的实际相机坐标系与世界坐标系之间的变换关系,其中,第一图像为车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,目标道路上包含平行的N条车道线,N为大于或等于M的整数,通过引入车载相机的虚拟理想相机坐标系作为中转坐标系,只需已知N条车道线中任意两条相邻的车道线的距离,即可根据第一图像中的M条第一平行线在车载相机的实际相机坐标系中的坐标和第一图像的鸟瞰图中M条第二平行线的平行约束、竖直约束、间距比例约束和间距约束,确定车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系,不需要对目标道路中的N条车道线设置其它特定的约束条件,提高了车载相机的外参标定方法的灵活性。
在一种可能的实现方式中,所述根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述第一图像的鸟瞰图中M条第二平行线的约束,确定所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系,包括:根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述平行约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的x轴旋转角;根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述竖直约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的y轴旋转角;根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距比例约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴旋转角;根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴平移距离。
该实现方式中,依次根据M条第一平行线在车载相机的实际相机坐标系中的坐标和平行约束,确定虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角,根据M条第一平行线在车载相机的实际相机坐标系中的坐标和竖直约束,确定虚拟理想相机坐标系相对于实际相机坐标系的y轴旋转角,根据M条第一平行线在车载相机的实际相机坐标系中的坐标和间距比例约束,确定虚拟理想相机坐标系相对于实际相机坐标系的z轴旋转角,根据M条第一平行线在车载相机的实际相机坐标系中的坐标和间距约束,确定虚拟理想相机坐标系相对于实际相机坐标系的z轴平移距离,得到车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系,提高了车载 相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系的准确度。
在一种可能的实现方式中，所述平行约束还包括：0.5∑(θ i-θ i+1) 2最小，其中，θ i为所述M条第二平行线中第i条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角，θ i+1为所述M条第二平行线中的第i+1条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角，i为大于或等于1且小于M的整数，θ i小于或等于90度，θ i+1小于或等于90度。
该实现方式中，将0.5∑(θ i-θ i+1) 2最小作为平行约束确定虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角，提高了虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角的准确度。
在一种可能的实现方式中,所述竖直约束还包括:0.5∑(θ j-90) 2最小,θ j为所述M条第二平行线中第j条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,j为大于或等于1且小于或等于M的整数,θ j小于或等于90度。
该实现方式中,将0.5∑(θ j-90) 2最小作为竖直约束确定虚拟理想相机坐标系相对于实际相机坐标系的y轴旋转角,提高了虚拟理想相机坐标系相对于实际相机坐标系的y轴旋转角的准确度。
在一种可能的实现方式中,所述间距比例约束还包括:
0.5∑(ω s,s+1/ω s+1,s+2-W s,s+1/W s+1,s+2) 2
最小,其中,ω s,s+1为所述M条第二平行线中的第s条第二平行线与第s+1条第二平行线之间的距离,ω s+1,s+2为所述M条第二平行线中的第s+1条第二平行线与第s+2条第二平行线之间的距离,W s,s+1为所述M条第二平行线中第s条第二平行线对应的车道线与第s+1条第二平行线对应的车道线之间的实际距离,W s+1,s+2为所述M条第二平行线中第s+1条第二平行线对应的车道线与第s+2条第二平行线对应的车道线之间的实际距离,s为大于或等于1且小于M-1的整数。
该实现方式中,将
0.5∑(ω s,s+1/ω s+1,s+2-W s,s+1/W s+1,s+2) 2
最小作为间距比例约束确定虚拟理想相机坐标系相对于实际相机坐标系的z轴旋转角,提高了虚拟理想相机坐标系相对于实际相机坐标系的z轴旋转角的准确度。
在一种可能的实现方式中,所述间距约束还包括0.5∑(ω i,i+1-W i,i+1) 2最小,ω i,i+1为所述M条第二平行线中的第i条第二平行线与第i+1条第二平行线之间的距离,W i,i+1为所述M条第二平行线中第i条第二平行线对应的车道线与第i+1条第二平行线对应的车道线之间的实际距离,i为大于或等于1且小于M的整数。
该实现方式中,将0.5∑(ω i,i+1-W i,i+1) 2最小作为间距约束确定虚拟理想相机坐标系相对于实际相机坐标系的z轴平移距离,提高了虚拟理想相机坐标系相对于实际相机坐标系的z轴平移距离的准确度。
在一种可能的实现方式中,所述方法还包括:获取所述车载相机在所述世界坐标系中的y轴平移量;获取车辆的定位信息,所述车辆的定位信息包括所述车辆当前的位置信息和/或所述车辆的加速度和/或所述车辆的轮速;根据所述车辆的定位信息和所述车载相机在所述世界坐标系中的y轴平移量建立所述车辆的观测模型;根据所述车 辆的观测模型获取所述车辆当前时刻的航向角;根据所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角对所述车辆当前时刻的航向角进行补偿,得到补偿后的航向角;将所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角更新为所述补偿后的航向角。
该实现方式中,根据由车辆的定位信息和车载相机在世界坐标系中的y轴平移量建立的车辆的观测模型获取车辆当前时刻的航向角,使用车载相机在世界坐标系下的外部参数中的航向角对车辆当前时刻的航向角进行补偿,得到补偿后的航向角,并将车载相机的实际相机坐标系与世界坐标系之间的变换关系中的航向角更新为补偿后的航向角,使得在车辆不完全平行于车道线行驶时,能够补偿车辆航向角的影响,提高了车载相机的实际相机坐标系与世界坐标系之间的变换关系的准确度。
第二方面,本申请提供一种车载相机的外部参数的标定装置,所述装置可以包括用于实现第一方面中的方法的各个模块,这些模块可以通过软件和/或硬件的方式实现。
第三方面,本申请提供一种车载相机的外部参数的标定装置。该装置可以包括与存储器耦合的处理器。其中,该存储器用于存储程序代码,该处理器用于执行该存储器中的程序代码,以实现第一方面或其中任意一种实现方式中的方法。
可选地,该装置还可以包括该存储器。
第四方面,本申请提供一种芯片,包括至少一个处理器和通信接口,所述通信接口和所述至少一个处理器通过线路互联,所述至少一个处理器用于运行计算机程序或指令,以执行如第一方面或其中任意一种可能的实现方式所述的方法。
第五方面,本申请提供一种计算机可读介质,该计算机可读介质存储用于设备执行的程序代码,该程序代码包括用于执行如第一方面或其中任意一种可能的实现方式所述的方法。
第六方面,本申请提供一种包含指令的计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行如第一方面或其中任意一种可能的实现方式所述的方法。
第七方面,本申请提供一种计算设备,包括至少一个处理器和通信接口,所述通信接口和所述至少一个处理器通过线路互联,所述通信接口与目标系统通信,所述至少一个处理器用于运行计算机程序或指令,以执行如第一方面或其中任意一种可能的实现方式所述的方法。
第八方面,本申请提供一种计算系统,包括至少一个处理器和通信接口,所述通信接口和所述至少一个处理器通过线路互联,所述通信接口与目标系统通信,所述至少一个处理器用于运行计算机程序或指令,以执行如第一方面或其中任意一种可能的实现方式所述的方法。
附图说明
图1为一种像素坐标系的示意图;
图2为一种相机坐标系的示意图;
图3为本申请的实施例的一种实际相机坐标系和虚拟理想相机坐标系的示意图;
图4为本申请的实施例的一种世界坐标系的示意图;
图5为本申请的实施例的一种应用场景的示意图;
图6为本申请的实施例的一种车载相机的外部参数的标定方法的流程示意图;
图7为本申请的实施例的一种第一图像的示意图;
图8为本申请的实施例的一种鸟瞰图的示意图;
图9为本申请的实施例的一种航向角补偿的示意图;
图10为本申请的实施例的另一种车载相机的外部参数的标定方法的流程示意图;
图11为本申请一个实施例的一种车载相机的外部参数的标定装置的示意性结构图;
图12为本申请另一个实施例的一种车载相机的外部参数的标定装置的示意性结构图。
具体实施方式
下面将结合本申请的实施例中的附图,对本申请实施例中的技术方案进行描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
图1为一种像素坐标系的示意图。如图1所示,像素坐标系的左上角的顶点为原点O p,水平向右为u轴,垂直向下为v轴。
像素坐标是指像素在图像中的位置。在像素坐标系中,任意一个像素点的坐标可以表示为(u i,v i)。像素的表示方法不能反应图像中物体的物理尺寸。
图2为一种相机坐标系的示意图。如图2所示,相机坐标系是以相机的光轴作为Z c轴,光线在相机光学系统的中心位置就是原点O c,实际上就是透镜的中心。相机坐标系的水平轴X c与垂直轴Y c分别与像素坐标系的u轴和v轴平行。
图3为本申请的实施例的一种实际相机坐标系和虚拟理想相机坐标系的示意图。如图3所示,L1、L2和L3为位于汽车行驶路面上的三条平行线,根据图2所示的相机坐标系的原点、x轴、y轴和z轴的选取规则和位置关系,选取相机透镜的中心作为实际相机坐标系的原点O r;选取相机的光轴作为实际相机坐标系的Z r,Z r与汽车行驶路面平行,相机的前方为Z r的正方向;选取与Z r垂直且与汽车行驶路面平行的方向为实际相机坐标系的X r,选取L2至L3的方向为X r的正方向,选取与汽车行驶路面垂直的方向为Y r(图中未画出),示例性的,选取垂直于汽车行驶路面向里的方向为Y r的正方向。
虚拟理想相机坐标系的原点与实际相机坐标系的原点O r重合,虚拟理想相机坐标系的y轴与Y r重合,且方向一致。将实际相机坐标系的Z r和X r绕Y r旋转β角度,直到X r与汽车行驶路面上的三条平行线垂直时,得到虚拟理想相机坐标系的z轴方向和x轴方向。
图4为本申请的实施例的一种世界坐标系的示意图。如图4所示,L1、L2和L3为位于汽车行驶路面上的三条平行线,ZOX为相机的虚拟理想相机坐标系,选取L2上的任意一点作为世界坐标系的原点O w,世界坐标系的X w的方向与相机的虚拟理想相机坐标系的Z轴方向一致,世界坐标系的Y w与L2垂直且与汽车行驶路面平行,选 取L2至L1的方向为Y w的正方向。
图5为本申请的实施例的一种应用场景的示意图。图5所示的场景是利用车辆行驶路面上的三条平行已知间距的平行线L1、L2和L3对相机的外部参数进行标定的场景。通过使车辆从三条平行线的一端行驶至另一端,完成对相机的外部参数的标定。其中,行驶路面上的三条平行已知间距的平行线L1、L2和L3可以为特定场地画出的车道线,也可以是正常道路上的车道线。
可以理解的是,图5所示的场景仅是一种示例,本申请的技术方案还可以应用于其它场景,只要该场景涉及对相机的外部参数的标定即可。例如,本申请的技术方案还可以应用于智能机器人中的相机的外部参数的标定等场景。
图6为本申请的实施例的一种车载相机的外部参数的标定方法的流程示意图。如图6所示,该方法至少包括S601至S603。
S601,获取第一图像中的M条第一平行线在车载相机的实际相机坐标系中的坐标,第一图像为车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,目标道路上包含平行的N条车道线,N条车道线中的任意两条车道线间的距离为已知的,M条第一平行线与N条车道线中的M条第一车道线一一对应,M为大于或等于3的整数,N为大于或等于M的整数。
在一种可能的实现方式中,车载相机的所属车辆在目标道路上的行驶过程中,车载相机对目标道路上的N条车道线中的M条车道线进行拍摄,得到第一图像。第一图像中包含M条第一平行线,该M条第一平行线由车载相机对目标道路上的N条车道线中的M条车道线的拍摄得到,第一图像中的M条第一平行线与该目标道路上的N条车道线中的M条车道线一一对应。
示例性的,图7为本申请的实施例的一种第一图像的示意图。如图7所示,该第一图像中有3条第一平行线L1、L2和L3,分别与目标道路中的N条车道线中的3条车道线对应,此时M等于3,N大于或等于3。
提取第一图像中的M条第一平行线,得到该M条第一平行线在像素坐标系中的坐标,并将M条平行线在像素坐标系中的坐标转换为M条平行线在车载相机的相机坐标系中的坐标。其中,像素坐标系的一种示例可以为图1所示的像素坐标系,相机坐标系的另一种示例可以图3所示的实际相机的相机坐标系。
作为一种示例,通过分割算法提取第一图像中的平行线区域和轮廓,再进行亚像素级别的边缘提取,获得M条第一平行线。例如,分割算法包括分水岭算法等。
作为另一种示例,通过霍夫变换提取第一图像中的直线,再采用聚类和过滤等方法,获得第一图像中的M条第一平行线。其中,霍夫变换是图像处理中的一种特征提取技术,它可以通过投票算法检测具有特定形状的物体;聚类方法利用把相近斜率和截距的直线段进行聚类;过滤方法是通过相机参考安装角度和位置设定感兴趣区域进行筛选。
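The two extraction routes just described (segmentation with sub-pixel edge extraction, or a Hough transform followed by clustering and filtering) can be sketched in a few lines of Python with OpenCV. The sketch below follows only the Hough-transform route; the Canny thresholds, Hough parameters, clustering tolerances and function names are illustrative assumptions rather than values given in this application.

```python
import cv2
import numpy as np

def extract_parallel_lines(image_path, slope_tol=0.05, intercept_tol=30.0):
    """Detect lane-marking segments in the first image and group them into M candidate lines."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, 50, 150)                       # edge map fed to the Hough transform
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=60, minLineLength=80, maxLineGap=20)
    if segments is None:
        return []

    clusters = []                                         # each cluster gathers segments with similar slope/intercept
    for x1, y1, x2, y2 in segments[:, 0]:
        if x2 == x1:                                      # ignore perfectly vertical pixel segments in this sketch
            continue
        k = (y2 - y1) / (x2 - x1)
        b = y1 - k * x1
        for cluster in clusters:
            if abs(k - cluster["k"]) < slope_tol and abs(b - cluster["b"]) < intercept_tol:
                cluster["segments"].append((x1, y1, x2, y2))
                break
        else:
            clusters.append({"k": k, "b": b, "segments": [(x1, y1, x2, y2)]})

    lines = []                                            # one representative (slope, intercept) per cluster
    for cluster in clusters:
        pts = np.asarray(cluster["segments"], dtype=float)
        ks = (pts[:, 3] - pts[:, 1]) / (pts[:, 2] - pts[:, 0])
        bs = pts[:, 1] - ks * pts[:, 0]
        lines.append((float(ks.mean()), float(bs.mean())))
    return lines
```

Converting the detected pixel-space lines into the actual camera coordinate system then uses the camera intrinsics, as in the bird's-eye-view sketch further below.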
S602,根据M条第一平行线在车载相机的实际相机坐标系中的坐标和第一图像的鸟瞰图中M条第二平行线的约束,确定车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系,M条第二平行线与所述M条第一平行线一一对应,M条第二平行线在鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约 束。
车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系为车载相机的虚拟理想相机坐标系相对于车载相机的实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离。其中,车载相机的实际相机坐标系的一种示例可以为图3所示的实际相机坐标系,车载相机的虚拟理想相机坐标系的一种示例可以为图3所示的虚拟理想相机坐标系。
在一种可能的实现方式中,根据M条第一平行线在车载相机的实际相机坐标系中的坐标,采用逆透视变换法,得到M条第一平行线的鸟瞰图。鸟瞰图中有M条第二平行线,M条第二平行线与M条第一平行线一一对应。
示例性的,图8为本申请的实施例的一种鸟瞰图的示意图。如图8所示,鸟瞰图中有3条第二平行线,M等于3,分别为L1、L2和L3,其中,θ 1为第二平行线L1与虚拟理想坐标系的x轴方向的夹角,θ 2为第二平行线L2与虚拟理想坐标系的x轴方向的夹角,θ 3为第二平行线L3与虚拟理想坐标系的x轴方向的夹角,ω 12为第二平行线L1与第二平行线L2之间的距离,ω 23为第二平行线L2与第二平行线L3之间的距离。
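Step S602 works on an inverse-perspective (bird's-eye-view) projection of the M first parallel lines. A minimal way to sketch that projection, assuming the intrinsic matrix K and the camera height above the road are known and a candidate rotation from the actual camera frame to the virtual ideal camera frame is supplied, is to back-project each pixel to a viewing ray, rotate the ray into the candidate ideal frame and intersect it with the road plane. All names below are illustrative; this is not the exact inverse perspective transform used in this application.

```python
import numpy as np

def pixels_to_birdseye(pixels, K, R_ideal_from_actual, cam_height):
    """Project road-surface pixels into top-down (x, z) coordinates of the ideal camera frame.

    pixels: (N, 2) array of (u, v) pixel coordinates lying on the road surface.
    K: 3x3 intrinsic matrix. R_ideal_from_actual: 3x3 rotation from actual to ideal frame.
    cam_height: camera height above the road, in metres; the road plane is assumed to be
    y = cam_height in the ideal frame (y axis pointing down towards the road).
    """
    uv1 = np.hstack([np.asarray(pixels, dtype=float), np.ones((len(pixels), 1))])
    rays = np.linalg.inv(K) @ uv1.T            # viewing rays in the actual camera frame
    rays = R_ideal_from_actual @ rays          # the same rays expressed in the candidate ideal frame
    scale = cam_height / rays[1]               # intersect each ray with the road plane y = cam_height
    ground = rays * scale
    return np.stack([ground[0], ground[2]], axis=1)
```

Fitting a straight line to the projected samples of each lane marking then gives the angles θ and spacings ω that the constraints below operate on.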
平行约束包括M条第二平行线平行,竖直约束包括M条第二平行线中的任意一条第二平行线与虚拟理想相机坐标系的x轴方向垂直,间距比例约束包括M条第二平行线中的任意两个第二平行线之间的距离比例与任意两个第二平行线对应的M条车道线之间的距离比例相同,距离约束包括任意两个第二平行线之间的距离与所述任意两个第二平行线对应的M条车道线之间的距离的差值最小。
在一种可能的实现方式中,采用虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离的优化顺序,依次优化车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系。
根据M条第一平行线在车载相机的实际相机坐标系中的坐标和平行约束,通过优化等方法确定虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角。
示例性的，平行约束包括使得损失函数0.5∑(θ i-θ i+1) 2最小，θ i为所述M条第二平行线中第i条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角，θ i+1为所述M条第二平行线中的第i+1条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角，i为大于或等于1且小于M的整数，θ i小于或等于90度，θ i+1小于或等于90度。
根据M条第一平行线在车载相机的实际相机坐标系中的坐标和竖直约束,通过优化等方法确定虚拟理想相机坐标系相对于所述实际相机坐标系的y轴旋转角。
示例性的,竖直约束包括使得损失函数0.5∑(θ j-90) 2最小,θ j为所述M条第二平行线中第j条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,j为大于或等于1且小于或等于M的整数,θ j小于或等于90度。
根据M条第一平行线在车载相机的实际相机坐标系中的坐标和间距比例约束,通过优化等方法确定虚拟理想相机坐标系相对于实际相机坐标系的z轴旋转角。
示例性的,间距比例约束包括使得损失函数
0.5∑(ω s,s+1/ω s+1,s+2-W s,s+1/W s+1,s+2) 2
最小,其中,ω s,s+1为M条第二平行线中的第s条第二平行线与第s+1条第二平行线之间的距离, ω s+1,s+2为M条第二平行线中的第s+1条第二平行线与第s+2条第二平行线之间的距离,W s,s+1为M条第二平行线中第s条第二平行线对应的车道线与第s+1条第二平行线对应的车道线之间的实际距离,W s+1,s+2为M条第二平行线中第s+1条第二平行线对应的车道线与第s+2条第二平行线对应的车道线之间的实际距离,s为大于或等于1且小于M-1的整数。
根据M条第一平行线在车载相机的实际相机坐标系中的坐标和间距约束,通过优化等方法确定虚拟理想相机坐标系相对于实际相机坐标系的z轴平移距离。
示例性的,所述间距约束还包括0.5∑(ω i,i+1-W i,i+1) 2最小,ω i,i+1为所述M条第二平行线中的第i条第二平行线与第i+1条第二平行线之间的距离,W i,i+1为所述M条第二平行线中第i条第二平行线对应的车道线与第i+1条第二平行线对应的车道线之间的实际距离,i为大于或等于1且小于M的整数。
使用该实现方式提供的虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离的优化顺序,能够避免欧拉角的耦合引入的误差,提高了车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系的准确度。
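A compact sketch of this sequential optimisation is given below: the x-axis rotation is found from the parallel constraint, the y-axis rotation from the vertical constraint, the z-axis rotation from the spacing-ratio constraint, and finally the metric scale (expressed in this application as a z-axis translation, folded here into the assumed camera height) from the spacing constraint. It reuses the pixels_to_birdseye helper sketched earlier; the line fitting, angle measurement and search bounds are simplified illustrations, not the procedure of this application.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.spatial.transform import Rotation

# pixels_to_birdseye() is the inverse-perspective helper sketched earlier in this document.

def line_angles_and_spacings(lines_px, K, euler_xyz_deg, cam_height):
    """For each line (given as pixel samples), measure its bird's-eye angle to the x axis
    (folded into [0, 90] degrees) and the spacings between adjacent lines along x."""
    R = Rotation.from_euler("xyz", euler_xyz_deg, degrees=True).as_matrix()
    angles, x_centres = [], []
    for pts in lines_px:                                   # pts: (N, 2) pixel samples of one lane line
        g = pixels_to_birdseye(pts, K, R, cam_height)
        d = g[-1] - g[0]                                   # coarse direction of the projected line
        angles.append(np.degrees(np.arctan2(abs(d[1]), abs(d[0]))))
        x_centres.append(g[:, 0].mean())
    spacings = np.abs(np.diff(np.sort(np.asarray(x_centres))))
    return np.asarray(angles), spacings

def calibrate_sequentially(lines_px, K, lane_widths, angle_bound=30.0):
    """lane_widths: known real spacings W between the corresponding adjacent lane lines (metres)."""
    euler = np.zeros(3)                                    # rotation of the ideal frame w.r.t. the actual frame
    W = np.asarray(lane_widths, dtype=float)
    height0 = 1.5                                          # provisional camera height used while rotating

    def solve(loss):
        return minimize_scalar(loss, bounds=(-angle_bound, angle_bound), method="bounded").x

    # 1) parallel constraint -> x-axis rotation
    euler[0] = solve(lambda ax: 0.5 * np.sum(np.diff(
        line_angles_and_spacings(lines_px, K, [ax, euler[1], euler[2]], height0)[0]) ** 2))
    # 2) vertical constraint -> y-axis rotation
    euler[1] = solve(lambda ay: 0.5 * np.sum(
        (line_angles_and_spacings(lines_px, K, [euler[0], ay, euler[2]], height0)[0] - 90.0) ** 2))
    # 3) spacing-ratio constraint -> z-axis rotation
    def ratio_loss(az):
        _, w = line_angles_and_spacings(lines_px, K, [euler[0], euler[1], az], height0)
        return 0.5 * np.sum((w[:-1] / w[1:] - W[:-1] / W[1:]) ** 2)
    euler[2] = solve(ratio_loss)
    # 4) spacing constraint -> metric scale (camera height here, z-axis translation in the application)
    def spacing_loss(h):
        _, w = line_angles_and_spacings(lines_px, K, euler, h)
        return 0.5 * np.sum((w - W) ** 2)
    height = minimize_scalar(spacing_loss, bounds=(0.2, 3.0), method="bounded").x
    return euler, height
```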
在另一种可能的实现方式中,M条第二平行线在鸟瞰图中的约束包括使得损失函数α∑(θ n-90) 2+β∑(ω m,m+1-W m,m+1) 2最小,其中α和β为权重参数,θ n为M条第二平行线中第n条第二平行线与虚拟理想相机坐标系的x轴方向的夹角,ω m,m+1为M条第二平行线中的第m条第二平行线与第m+1条第二平行线之间的距离,W m,m+1为M条第二平行线中第m条第二平行线对应的车道线与第m+1条第二平行线对应的车道线之间的实际距离,n为大于或等于1且小于或等于M的整数,m为大于或等于1且小于M的整数。
采用该实现方式提供的约束方法,可以结合优化等方法同时得到虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离,但容易由于初值选取不当,而只得到车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系的局部优化值,而非全局最优值。
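For contrast, the joint alternative described in the preceding paragraph minimises the single weighted loss α∑(θ n-90) 2+β∑(ω m,m+1-W m,m+1) 2 over all four quantities at once; as noted, a poorly chosen initial value can trap the solver in a local minimum rather than the global optimum. A sketch of that formulation, reusing line_angles_and_spacings from the previous sketch and with illustrative weights and initial guess:

```python
import numpy as np
from scipy.optimize import minimize

def calibrate_jointly(lines_px, K, lane_widths, alpha=1.0, beta=1.0,
                      x0=(0.0, 0.0, 0.0, 1.5)):
    """Optimise the three rotation angles and the scale term together; x0 is the initial guess."""
    W = np.asarray(lane_widths, dtype=float)

    def loss(p):
        angles, w = line_angles_and_spacings(lines_px, K, p[:3], p[3])
        return alpha * np.sum((angles - 90.0) ** 2) + beta * np.sum((w - W) ** 2)

    res = minimize(loss, np.asarray(x0, dtype=float), method="Nelder-Mead")
    return res.x[:3], res.x[3]
```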
S603,根据车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系以及虚拟理想相机坐标系与世界坐标系之间的变换关系,确定车载相机的实际相机坐标系与世界坐标系之间的变换关系。
虚拟理想相机坐标系与世界坐标系之间的变换关系为世界坐标系相对于车载相机的虚拟理想相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离。车载相机的实际相机坐标系与世界坐标系之间的变换关系即为车载相机的外参,包括世界坐标系相对于车载相机的实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离,也可以称为翻滚角(roll)、俯仰角(pitch)、偏航角(yaw)和平移距离。其中,车载相机的实际相机坐标系的一种示例可以为图3所示的车载相机的实际相机坐标系,车载相机的虚拟理想相机坐标系的一种示例可以为图3或图4所示的车载相机的虚拟理想相机坐标系,世界坐标系的一种示例可以为图4所示的世界坐标系。
在一种可能的实现方式中,通过转换矩阵
Figure PCTCN2021115802-appb-000004
将车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系转换为虚拟理想相机坐标系与世界坐标系之间的变换关系,得到车载相机的实际相机坐标系与世界坐标系之间的变换关系,即车载相机的外参。
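Obtaining the extrinsics in S603 is then a composition of two transforms. The conversion matrix between the virtual ideal camera coordinate system and the world coordinate system appears only as a figure in this publication, so the sketch below treats it as an input (T_world_from_ideal) supplied by the reader and simply chains it with the actual-to-ideal transform found above; the Euler-angle convention at the end is one common choice, not necessarily the one used here.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def compose_extrinsics(T_world_from_ideal, R_ideal_from_actual, t_ideal_from_actual):
    """Chain actual camera -> ideal camera -> world into one 4x4 extrinsic matrix."""
    T_ideal_from_actual = np.eye(4)
    T_ideal_from_actual[:3, :3] = R_ideal_from_actual
    T_ideal_from_actual[:3, 3] = t_ideal_from_actual
    T_world_from_actual = T_world_from_ideal @ T_ideal_from_actual

    # roll / pitch / yaw of the camera in the world frame (one possible convention)
    rpy_deg = Rotation.from_matrix(T_world_from_actual[:3, :3]).as_euler("xyz", degrees=True)
    return T_world_from_actual, rpy_deg
```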
在得到车载相机的实际相机坐标系与世界坐标系之间的变换关系之后,获取车载相机在世界坐标系中的y轴平移量;并从车载相机的所属车辆的传感器中获取车辆的定位信息,其中,车载相机的所属车辆的传感器包括轮速计、惯性传感器(inertial measurement unit,IMU)和全球定位系统(global positioning system,GPS)等,相应地,车辆的定位信息包括车辆当前的位置信息和/或车辆的加速度和/或车辆的轮速等。根据车辆的定位信息和车载相机在世界坐标系中的y轴平移量建立车辆的观测模型;并根据车辆的观测模型获取车辆当前时刻的航向角;再根据得到的车载相机的外部参数中的航向角对车辆当前时刻的航向角进行补偿,得到补偿后的航向角;将车载相机的外参中的航向角更新为该补偿后的航向角。
示例性的,图9为本申请的实施例的一种航向角补偿的示意图,如图9所示,车辆航向角为根据车辆的观测模型获取的车辆当前时刻的航向角,相机航向角为确定的车载相机的外参中的航向角,将车辆航向角减去相机航向角,得到相机到车辆的航向角,也可以称为补偿后的航向角,并将确定的车载相机中的航向角更新为补偿后的航向角,补偿了车辆定位得到的车辆航向角的影响。
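The compensation in Figure 9 is a plain subtraction: the camera-to-vehicle heading is the vehicle heading from the observation model minus the camera heading taken from the extrinsics, and that value replaces the yaw in the extrinsics. A tiny sketch, with angle wrapping added for robustness (the example numbers are invented):

```python
def compensate_heading(vehicle_yaw_deg, camera_yaw_deg):
    """Camera-to-vehicle heading = vehicle heading - camera heading, wrapped into [-180, 180)."""
    diff = vehicle_yaw_deg - camera_yaw_deg
    return (diff + 180.0) % 360.0 - 180.0

# e.g. the extrinsics report a yaw of 2.4 deg while the vehicle drives at 1.1 deg to the lanes:
compensated_yaw = compensate_heading(1.1, 2.4)   # -1.3 deg, written back into the extrinsics
```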
在另一种可能的实现方式中,缓存车载相机的多个完成航向角补偿后的外部参数,对该车载相机的多个完成航向角补偿后的外部参数进行滤波,得到滤波后的车载相机的外部参数,将车载相机的外部参数更新为滤波后的车载相机的外部参数。
示例性的,缓存车载相机的多个完成航向角补偿后的外部参数,进行周期性最优估计。若车辆行驶出标定场地时,缓存的完成航向角补偿后的车载相机的外部参数不超过预设阈值,则缓存的完成航向角补偿后的车载相机的外部参数都用于进行最优估计,并完成标定;若车辆行驶出标定场地时,缓存的完成航向角补偿后的车载相机的外部参数超过预设阈值,则选取每间隔一定数量(如10帧)缓存的完成航向角补偿后的车载相机的外部参数进行最优估计。其中,标定场地为用于车载相机的外参标定的目标道路中的N条车道线所在的区域。
优选地,对完成航向角补偿后的车载相机的外部参数中的每一类型的外部参数分别缓存一组数据,即将完成航向角补偿后的车载相机的外部参数中的翻滚角、俯仰角、航向角和平移距离分别缓存一组数据,采用核密度估计法,获取每组数据中概率最大的外参值,即为该类型外参的最优估计。
具体地,核密度估计法是对数据统计直方图的一个高斯函数的拟合,找到一个与已知数据的直方图拟合效果最好的高斯函数,能够抵抗一定量的噪声干扰。对于每一类型的外参,通过每种类型对应的一组外参值,估计出拟合的核函数(优选高斯核函数),找到核函数的峰值,对应的横坐标值(高斯函数概率最大处)就是每种类型外参的最优值。
可选地,采用四维核密度估计法可以同时估计出四种类型的外部参数中每种类型外参的最优值。
可选地,针对每种类型的外参分别缓存一组数据,采用中值滤波法,得到每种类型外参的中位数,即为每种类型外参的最优值。
可选地,针对每种类型的外参分别缓存一组数据,采用均值滤波法,计算每组数据的平均值,得到的每组数据的均值即为每种类型外参的最优值。
直到得到的每种类型的外参的最优值的标准差小于特定阈值时,则标定结果收敛,停止标定,得到车载相机的外部参数。
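The buffered, heading-compensated extrinsics are thus fused per parameter type, preferably by taking the mode of a Gaussian kernel-density estimate, with median or mean filtering as the simpler alternatives, and calibration stops once the spread of each parameter falls below a threshold. A sketch with scipy's gaussian_kde; the grid size and convergence threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_mode(samples, num_grid=512):
    """Most probable value of one extrinsic parameter under a Gaussian kernel density estimate."""
    samples = np.asarray(samples, dtype=float)
    kde = gaussian_kde(samples)
    grid = np.linspace(samples.min(), samples.max(), num_grid)
    return float(grid[np.argmax(kde(grid))])

def fuse_extrinsics(buffered, method="kde"):
    """buffered: dict such as {"roll": [...], "pitch": [...], "yaw": [...], "translation": [...]}."""
    pick = {"kde": kde_mode, "median": np.median, "mean": np.mean}[method]
    return {name: float(pick(values)) for name, values in buffered.items()}

def converged(buffered, threshold=0.05):
    """Calibration stops once every parameter's buffered estimates have a small standard deviation."""
    return all(np.std(np.asarray(values, dtype=float)) < threshold for values in buffered.values())
```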
本申请提供的技术方案中,根据车载相机获取的第一图像中的M条第一平行线在车载相机的实际相机坐标系中的坐标和第一图像的鸟瞰图中M条第二平行线的平行约束、竖直约束、间距比例约束和间距约束,按照虚拟理想相机坐标系相对于实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离的优化顺序,确定车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系;再根据车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系以及虚拟理想相机坐标系与世界坐标系之间的变换关系,确定车载相机的实际相机坐标系与世界坐标系之间的变换关系,提高了车载相机的外参标定方法的灵活性,同时提高了车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系(即车载相机的外部参数)的准确度。
图10为本申请的实施例的另一种车载相机的外部参数的标定方法的流程示意图。如图10所示,该方法至少包括S1001至S1011。
S1001,获取第一图像中的M条第一平行线在车载相机的实际相机坐标系中的坐标,第一图像为车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,目标道路上包含平行的N条车道线,N条车道线中的任意两条车道线间的距离为已知的,M条第一平行线与N条车道线中的M条第一车道线一一对应,M为大于或等于3的整数,N为大于或等于M的整数。
S1002,根据M条第一平行线在车载相机的实际相机坐标系中的坐标和第一图像的鸟瞰图中M条第二平行线的约束,确定车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系,M条第二平行线与所述M条第一平行线一一对应,M条第二平行线在鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约束。
需要说明的是,S1001至S1002可以参考S601至S602,此处不再进行赘述。
S1003,根据车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系以及虚拟理想相机坐标系与世界坐标系之间的变换关系,确定车载相机的第一外部参数。
虚拟理想相机坐标系与世界坐标系之间的变换关系为世界坐标系相对于车载相机的虚拟理想相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离。车载相机的实际相机坐标系与世界坐标系之间的变换关系即为车载相机的外参,包括世界坐标系相对于车载相机的实际相机坐标系的x轴旋转角度、y轴旋转角度、z轴旋转角度和z轴平移距离,也可以称为翻滚角(roll)、俯仰角(pitch)、偏航角(yaw) 和平移距离。其中,车载相机的实际相机坐标系的一种示例可以为图3所示的车载相机的实际相机坐标系,车载相机的虚拟理想相机坐标系的一种示例可以为图3或图4所示的车载相机的虚拟理想相机坐标系,世界坐标系的一种示例可以为图4所示的世界坐标系。
在一种可能的实现方式中,通过转换矩阵
Figure PCTCN2021115802-appb-000005
将车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系转换为虚拟理想相机坐标系与世界坐标系之间的变换关系,得到车载相机的实际相机坐标系与世界坐标系之间的第一变换关系,即车载相机的第一外部参数。
S1004,根据车载相机的实际相机坐标系与车载相机的虚拟理想相机坐标系之间的变换关系以及虚拟理想相机坐标系与世界坐标系之间的变换关系,获取车载相机在世界坐标系中的y轴平移量。
作为一种示例,在如图4所示的世界坐标系中,车载相机在世界坐标系的Y w轴上的位置即为车载相机在世界坐标系中的y轴平移量。
S1005,根据车载相机的所属车辆的定位信息和车载相机在世界坐标系中的y轴平移量,建立该车辆的观测模型,该观测模型包含了y轴平移量对状态量的影响。
在一种可能的实现方式中,从车载相机的所属车辆的传感器中获取车辆的定位信息,其中,车载相机的所属车辆的传感器包括轮速计、IMU和GPS等,相应地,车辆的定位信息包括车辆当前的位置信息和/或车辆的加速度和/或车辆的轮速等。
S1006,根据车辆的观测模型获取车辆当前时刻的航向角。
在一种可能的实现方式中,对车辆的观测模型进行滤波更新,并获取车辆当前时刻的航向角。
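The application does not spell out the filter behind this observation model, so the following is only an illustrative stand-in: it derives a current heading from successive filtered planar positions (as GPS plus wheel-speed integration might provide) with simple exponential smoothing. It is not the observation model of this application.

```python
import numpy as np

def heading_from_positions(positions_xy, smoothing=0.3):
    """Estimate the current heading (degrees) from a sequence of planar vehicle positions."""
    positions_xy = np.asarray(positions_xy, dtype=float)
    deltas = np.diff(positions_xy, axis=0)
    step_headings = np.degrees(np.arctan2(deltas[:, 1], deltas[:, 0]))
    estimate = step_headings[0]
    for h in step_headings[1:]:
        # unwrap the new heading relative to the running estimate before blending
        h = estimate + (h - estimate + 180.0) % 360.0 - 180.0
        estimate = (1.0 - smoothing) * estimate + smoothing * h
    return estimate
```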
S1007,根据车载相机的第一外部参数中的航向角对车辆当前时刻的航向角进行补偿,得到补偿后的航向角。
在一种可能的实现方式中,如图9所示的航向角补偿的示意图,车辆航向角为根据车辆的观测模型获取的车辆当前时刻的航向角,相机航向角为确定的车载相机的外参中的航向角,将车辆航向角减去相机航向角,得到相机到车辆的航向角,也可以称为补偿后的航向角,并将确定的车载相机中的航向角更新为补偿后的航向角,补偿了车辆定位得到的车辆航向角的影响。
S1008,将车载相机的第一外部参数中的航向角更新为补偿后的航向角。
车载相机的第一外部参数中包括第一翻滚角、第一俯仰角、第一航向角和第一平移距离。将车载相机的第一外部参数中的第一航向角更新为补偿后的航向角,更新后车载相机的第一外部参数包括第一翻滚角、第一俯仰角、补偿后的航向角和第一平移距离。
S1009,缓存车载相机的多个第一外部参数。
在一种可能的实现方式中,在车载相机的外部参数的标定过程中,车载相机的所属车辆从驶入标定场地到驶出标定场地期间,车载相机拍摄得到多张第一图像,并根据多张第一图像得到车载相机的多个第一外部参数,对车载相机的多个第一外部参数 进行缓存。标定场地为用于车载相机的外参标定的目标道路中的N条车道线所在的区域。
S1010,对缓存的车载相机的多个第一外部参数进行滤波,得到车载相机的第二外部参数。
在一种可能的实现方式中,对缓存的车载相机的多个第一外部参数进行周期性最优估计。
可选地,若车载相机的所属车辆从驶入标定场地到驶出标定场地期间,缓存的车载相机的多个第一外部参数的数量不超过预设阈值,则缓存的车载相机的多个第一外部参数都用于进行最优估计,并完成标定;若车载相机的所属车辆从驶入标定场地到驶出标定场地期间,缓存的车载相机的多个第一外部参数的数量超过预设阈值,则选取每间隔一定数量(如10帧)缓存的车载相机的第一外部参数进行最优估计。
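The frame-selection rule just described (use every buffered frame while the buffer stays below a preset threshold, otherwise keep only every tenth or so) is straightforward to express; the threshold and stride below are illustrative.

```python
def select_for_estimation(buffered_frames, threshold=200, stride=10):
    """Use every buffered set of extrinsics if the buffer is small, otherwise subsample it."""
    if len(buffered_frames) <= threshold:
        return list(buffered_frames)
    return list(buffered_frames[::stride])
```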
优选地,对车载相机的多个第一外部参数中的每种类型的外部参数分别缓存一组数据,即将车载相机的多个第一外部参数中的翻滚角、俯仰角、航向角和平移距离分别缓存一组数据,采用核密度估计法,获取每组数据中概率最大的外参值,即为该类型外参的最优估计。
具体地,核密度估计法是对数据统计直方图的一个高斯函数的拟合,找到一个与已知数据的直方图拟合效果最好的高斯函数,能够抵抗一定量的噪声干扰。对于每一类型的外参,通过每种类型对应的一组外参值,估计出拟合的核函数(优选高斯核函数),找到核函数的峰值,对应的横坐标值(高斯函数概率最大处)就是每种类型外参的最优值。
可选地,采用四维核密度估计法可以同时估计出四种类型的外部参数中每种类型外参的最优值。
可选地,针对每种类型的外参分别缓存一组数据,采用中值滤波法,得到每种类型外参的中位数,即为每种类型外参的最优值。
可选地,针对每种类型的外参分别缓存一组数据,采用均值滤波法,计算每组数据的平均值,得到的每组数据的均值即为每种类型外参的最优值。
直到得到的每种类型的外参的最优值的标准差小于特定阈值时,则标定结果收敛,停止标定,得到车载相机的第二外部参数。
S1011,将所车载相机的外部参数更新为第二外部参数。
本申请提供的技术方案中,使用车载相机在世界坐标系下的外部参数中的航向角对车辆当前时刻的航向角进行补偿,得到补偿后的航向角,并将车载相机的实际相机坐标系与世界坐标系之间的变换关系中的航向角更新为补偿后的航向角,使得在车辆不完全平行于车道线行驶时,能够补偿车辆航向角的影响,提高了车载相机的实际相机坐标系与世界坐标系之间的变换关系的准确度。
图11为本申请一个实施例的一种车载相机的外部参数的标定装置的示意性结构图。如图11所示,装置1100可以包括获取模块1101和处理模块1102。装置1100可以用于实现上述任意一个实施例所示的方法。
在一种可能的实现方式中,装置1100可以用于实现上述图6所示的方法。例如,获取模块1101用于实现S601,处理模块1102用于实现S602和S603。
在另一种可能的实现方式中,装置1100还包括补偿模块、更新模块和缓存模块。该实现方式中的装置1100可以用于实现上述图10所示的方法。例如,获取模块1101用于实现S1001、S1004和S1006,处理模块1102用于实现S1002、S1003、S1005和S1010,补偿模块用于实现S1007,更新模块用于实现S1008和S1011,缓存模块用于实现S1009。
图12为本申请另一个实施例的一种车载相机的外部参数的标定装置的示意性结构图。图12所示的装置1200可以用于执行上述任意一个实施例所示的车载相机的外部参数的标定方法。
如图12所示,本实施例的装置1200包括:存储器1201、处理器1202、通信接口1203以及总线1204。其中,存储器1201、处理器1202、通信接口1203通过总线1204实现彼此之间的通信连接。
存储器1201可以是只读存储器(read only memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(random access memory,RAM)。存储器1201可以存储程序,当存储器1201中存储的程序被处理器1202执行时,处理器1202可以用于执行图6和图10所示的方法的各个步骤。
处理器1202可以采用通用的中央处理器(central processing unit,CPU),微处理器,应用专用集成电路(application specific integrated circuit,ASIC),或者一个或多个集成电路,用于执行相关程序,以实现本申请方法实施例的车载相机的外部参数的标定方法。
处理器1202还可以是一种集成电路芯片,具有信号的处理能力。在实现过程中,本申请各个实施例的方法的各个步骤可以通过处理器1202中的硬件的集成逻辑电路或者软件形式的指令完成。
上述处理器1202还可以是通用处理器、数字信号处理器(digital signal processing,DSP)、专用集成电路(ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器1201,处理器1202读取存储器1201中的信息,结合其硬件完成本申请实施例中各个方法所需执行的功能,例如,可以执行图6和图10所示实施例的各个步骤/功能。
通信接口1203可以使用但不限于收发器一类的收发装置,来实现装置1200与其他设备或通信网络之间的通信。
总线1204可以包括在装置1200各个部件(例如,存储器1201、处理器1202、通信接口1203)之间传送信息的通路。
应理解,本申请实施例所示的装置1200可以是电子设备,或者,也可以是配置于电子设备中的芯片。
应理解,本申请实施例中的处理器可以为中央处理单元(central processing unit,CPU),该处理器还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
还应理解,本申请实施例中的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的随机存取存储器(random access memory,RAM)可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
上述实施例,可以全部或部分地通过软件、硬件、固件或其他任意组合来实现。当使用软件实现时,上述实施例可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令或计算机程序。在计算机上加载或执行所述计算机指令或计算机程序时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以为通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集合的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质。半导体介质可以是固态硬盘。
应理解,本文中术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,其中A,B可以是单数或者复数。另外,本文中字符“/”,一般表示前后关联对象是一种“或”的关系,但也可能表示的是一种“和/或”的关系,具体可参考前后文进行理解。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (17)

  1. 一种车载相机的外部参数的标定方法,其特征在于,所述方法包括:
    获取第一图像中的M条第一平行线在所述车载相机的实际相机坐标系中的坐标,所述第一图像为所述车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,所述目标道路上包含平行的N条车道线,所述N条车道线中的任意两条车道线之间的距离是已知的,所述M条第一平行线与所述N条车道线中的M条第一车道线一一对应,M为大于或等于3的整数,N为大于或等于M的整数;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述第一图像的鸟瞰图中M条第二平行线的约束,确定所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系,所述M条第二平行线与所述M条第一平行线一一对应,所述M条第二平行线在所述鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约束,所述平行约束包括所述M条第二平行线平行,所述竖直约束包括所述M条第二平行线中的任意一条第二平行线与所述虚拟理想相机坐标系的x轴方向垂直,所述间距比例约束包括所述M条第二平行线中的任意两个第二平行线之间的距离比例与所述任意两个第二平行线对应的M条车道线之间的距离比例相同,所述距离约束包括所述任意两个第二平行线之间的距离与所述任意两个第二平行线对应的M条车道线之间的距离的差值最小;
    根据所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系以及所述虚拟理想相机坐标系与世界坐标系之间的变换关系,确定所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述第一图像的鸟瞰图中M条第二平行线的约束,确定所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系,包括:
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述平行约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的x轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述竖直约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的y轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距比例约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴平移距离。
  3. 根据权利要求1或2所述的方法,其特征在于,所述平行约束还包括:0.5Σ(θ ii+1) 2最小,其中,θ i为所述M条第二平行线中第i条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,θ i+1为所述M条第二平行线中的第i+1条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,i为大于或等于1且小于M的整数,θ i小于或等于90度,θ i+1小于或等于90度。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述竖直约束还包括: 0.5Σ(θ j-90) 2最小,θ j为所述M条第二平行线中第j条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,j为大于或等于1且小于或等于M的整数,θ j小于或等于90度。
  5. 根据权利要求1至4中任一项所述的方法,其特征在于,所述间距比例约束还包括:
    0.5Σ(ω s,s+1/ω s+1,s+2-W s,s+1/W s+1,s+2) 2
    最小,其中,ω s,s+1为所述M条第二平行线中的第s条第二平行线与第s+1条第二平行线之间的距离,ω s+1,s+2为所述M条第二平行线中的第s+1条第二平行线与第s+2条第二平行线之间的距离,W s,s+1为所述M条第二平行线中第s条第二平行线对应的车道线与第s+1条第二平行线对应的车道线之间的实际距离,W s+1,s+2为所述M条第二平行线中第s+1条第二平行线对应的车道线与第s+2条第二平行线对应的车道线之间的实际距离,s为大于或等于1且小于M-1的整数。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述间距约束还包括0.5Σ(ω i,i+1-W i,i+1) 2最小,ω i,i+1为所述M条第二平行线中的第i条第二平行线与第i+1条第二平行线之间的距离,W i,i+1为所述M条第二平行线中第i条第二平行线对应的车道线与第i+1条第二平行线对应的车道线之间的实际距离,i为大于或等于1且小于M的整数。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,所述方法还包括:
    获取所述车载相机在所述世界坐标系中的y轴平移量;
    获取车辆的定位信息,所述车辆的定位信息包括所述车辆当前的位置信息和/或所述车辆的加速度和/或所述车辆的轮速;
    根据所述车辆的定位信息和所述车载相机在所述世界坐标系中的y轴平移量建立所述车辆的观测模型;
    根据所述车辆的观测模型获取所述车辆当前时刻的航向角;
    根据所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角对所述车辆当前时刻的航向角进行补偿,得到补偿后的航向角;
    将所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角更新为所述补偿后的航向角。
  8. 一种车载相机的外部参数的标定装置,其特征在于,所述装置包括:
    获取模块,用于获取第一图像中的M条第一平行线在所述车载相机的实际相机坐标系中的坐标,所述第一图像为所述车载相机的所属车辆在目标道路上的行驶过程中拍摄得到的图像,所述目标道路上包含平行的N条车道线,所述N条车道线中的任意两条车道线之间的距离是已知的,所述M条第一平行线与所述N条车道线中的M条第一车道线一一对应,M为大于或等于3的整数,N为大于或等于M的整数;
    处理模块,用于根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述第一图像的鸟瞰图中M条第二平行线的约束,确定所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系,所述M条第二平行线与所述M条第一平行线一一对应,所述M条第二平行线在所述鸟瞰图中的约束包括平行约束、竖直约束、间距比例约束和间距约束,所述平行约束包括所述M条第二平行线平行,所述竖直约束包括所述M条第二平行线中的任意一条第二平行线与所述 虚拟理想相机坐标系的x轴方向垂直,所述间距比例约束包括所述M条第二平行线中的任意两个第二平行线之间的距离比例与所述任意两个第二平行线对应的M条车道线之间的距离比例相同,所述距离约束包括所述任意两个第二平行线之间的距离与所述任意两个第二平行线对应的M条车道线之间的距离的差值最小;
    所述处理模块,还用于根据所述车载相机的实际相机坐标系与所述车载相机的虚拟理想相机坐标系之间的变换关系以及所述虚拟理想相机坐标系与世界坐标系之间的变换关系,确定所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系。
  9. 根据权利要求8所述的装置,其特征在于,所述处理模块具体用于:
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述平行约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的x轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述竖直约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的y轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距比例约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴旋转角;
    根据所述M条第一平行线在所述车载相机的实际相机坐标系中的坐标和所述间距约束,确定所述虚拟理想相机坐标系相对于所述实际相机坐标系的z轴平移距离。
  10. 根据权利要求8或9所述的装置,其特征在于,所述平行约束还包括:0.5Σ(θ ii+1) 2最小,其中,θ i为所述M条第二平行线中第i条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,θ i+1为所述M条第二平行线中的第i+1条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,i为大于或等于1且小于M的整数,θ i小于或等于90度,θ i+1小于或等于90度。
  11. 根据权利要求8至10中任一项所述的装置,其特征在于,所述竖直约束还包括:0.5Σ(θ j-90) 2最小,θ j为所述M条第二平行线中第j条第二平行线与所述虚拟理想相机坐标系的x轴方向的夹角,j为大于或等于1且小于或等于M的整数,θ j小于或等于90度。
  12. 根据权利要求8至11中任一项所述的装置,其特征在于,所述间距比例约束还包括:
    0.5Σ(ω s,s+1/ω s+1,s+2-W s,s+1/W s+1,s+2) 2
    最小,其中,ω s,s+1为所述M条第二平行线中的第s条第二平行线与第s+1条第二平行线之间的距离,ω s+1,s+2为所述M条第二平行线中的第s+1条第二平行线与第s+2条第二平行线之间的距离,W s,s+1为所述M条第二平行线中第s条第二平行线对应的车道线与第s+1条第二平行线对应的车道线之间的实际距离,W s+1,s+2为所述M条第二平行线中第s+1条第二平行线对应的车道线与第s+2条第二平行线对应的车道线之间的实际距离,s为大于或等于1且小于M-1的整数。
  13. 根据权利要求8至12中任一项所述的装置,其特征在于,所述间距约束还包括0.5Σ(ω i,i+1-W i,i+1) 2最小,ω i,i+1为所述M条第二平行线中的第i条第二平行线与第i+1条第二平行线之间的距离,W i,i+1为所述M条第二平行线中第i条第二平行线对应的车道线与第i+1条第二平行线对应的车道线之间的实际距离,i为大于或等于1且小于M的整数。
  14. 根据权利要求8至13中任一项所述的装置,其特征在于,所述装置还包括补 偿模块,所述补偿模块用于:
    获取所述车载相机在所述世界坐标系中的y轴平移量;
    获取车辆的定位信息,所述车辆的定位信息包括所述车辆当前的位置信息和/或所述车辆的加速度和/或所述车辆的轮速;
    根据所述车辆的定位信息和所述车载相机在所述世界坐标系中的y轴平移量建立所述车辆的观测模型;
    根据所述车辆的观测模型获取所述车辆当前时刻的航向角;
    根据所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角对所述车辆当前时刻的航向角进行补偿,得到补偿后的航向角;
    将所述车载相机的实际相机坐标系与所述世界坐标系之间的变换关系中的航向角更新为所述补偿后的航向角。
  15. 一种车载相机的外部参数的标定装置,其特征在于,包括:存储器和处理器;
    所述存储器用于存储程序指令;
    所述处理器用于调用所述存储器中的程序指令执行如权利要求1至7中任一项所述的方法。
  16. 一种芯片,其特征在于,包括至少一个处理器和通信接口,所述通信接口和所述至少一个处理器通过线路互联,所述至少一个处理器用于运行计算机程序或指令,以执行如权利要求1至7中任一项所述的方法。
  17. 一种计算机可读介质,其特征在于,所述计算机可读介质存储用于计算机执行的程序代码,该程序代码包括用于执行如权利要求1至7中任一项所述的方法的指令。
PCT/CN2021/115802 2021-08-31 2021-08-31 车载相机的外部参数的标定方法及相关装置 WO2023028880A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180006501.9A CN114730472A (zh) 2021-08-31 2021-08-31 车载相机的外部参数的标定方法及相关装置
PCT/CN2021/115802 WO2023028880A1 (zh) 2021-08-31 2021-08-31 车载相机的外部参数的标定方法及相关装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/115802 WO2023028880A1 (zh) 2021-08-31 2021-08-31 车载相机的外部参数的标定方法及相关装置

Publications (1)

Publication Number Publication Date
WO2023028880A1 true WO2023028880A1 (zh) 2023-03-09

Family

ID=82235994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/115802 WO2023028880A1 (zh) 2021-08-31 2021-08-31 车载相机的外部参数的标定方法及相关装置

Country Status (2)

Country Link
CN (1) CN114730472A (zh)
WO (1) WO2023028880A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012508A (zh) * 2023-03-28 2023-04-25 高德软件有限公司 车道线的渲染方法、装置、存储介质及程序产品
CN116704040A (zh) * 2023-04-03 2023-09-05 上海保隆汽车科技(武汉)有限公司 相机标定方法、装置、控制器、车辆及存储介质
CN116934847A (zh) * 2023-09-15 2023-10-24 蓝思系统集成有限公司 卸料方法、装置、电子设备及存储介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116109698B (zh) * 2023-04-11 2023-07-14 禾多科技(北京)有限公司 目标虚拟车位坐标值的确定方法、装置及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651963A (zh) * 2016-12-29 2017-05-10 清华大学苏州汽车研究院(吴江) 一种用于驾驶辅助系统的车载摄像头的安装参数标定方法
CN112184830A (zh) * 2020-09-22 2021-01-05 深研人工智能技术(深圳)有限公司 相机内参和外参标定方法、装置、计算机设备及存储介质
US20210027496A1 (en) * 2018-05-23 2021-01-28 Panasonic Intellectual Property Management Co., Ltd. Calibration apparatus and calibration method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651963A (zh) * 2016-12-29 2017-05-10 清华大学苏州汽车研究院(吴江) 一种用于驾驶辅助系统的车载摄像头的安装参数标定方法
US20210027496A1 (en) * 2018-05-23 2021-01-28 Panasonic Intellectual Property Management Co., Ltd. Calibration apparatus and calibration method
CN112184830A (zh) * 2020-09-22 2021-01-05 深研人工智能技术(深圳)有限公司 相机内参和外参标定方法、装置、计算机设备及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAO MA; ZHIZHENG LIU; GUOHANG YAN; YIKANG LI: "CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 8 March 2021 (2021-03-08), 201 Olin Library Cornell University Ithaca, NY 14853 , XP081907264 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012508A (zh) * 2023-03-28 2023-04-25 高德软件有限公司 车道线的渲染方法、装置、存储介质及程序产品
CN116012508B (zh) * 2023-03-28 2023-06-23 高德软件有限公司 车道线的渲染方法、装置及存储介质
CN116704040A (zh) * 2023-04-03 2023-09-05 上海保隆汽车科技(武汉)有限公司 相机标定方法、装置、控制器、车辆及存储介质
CN116704040B (zh) * 2023-04-03 2024-03-15 上海保隆汽车科技(武汉)有限公司 相机标定方法、装置、控制器、车辆及存储介质
CN116934847A (zh) * 2023-09-15 2023-10-24 蓝思系统集成有限公司 卸料方法、装置、电子设备及存储介质
CN116934847B (zh) * 2023-09-15 2024-01-05 蓝思系统集成有限公司 卸料方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN114730472A (zh) 2022-07-08

Similar Documents

Publication Publication Date Title
WO2023028880A1 (zh) 车载相机的外部参数的标定方法及相关装置
CN110163930B (zh) 车道线生成方法、装置、设备、系统及可读存储介质
CN110147382B (zh) 车道线更新方法、装置、设备、系统及可读存储介质
US9270891B2 (en) Estimation of panoramic camera orientation relative to a vehicle coordinate frame
WO2018177159A1 (zh) 运动物体的位置确定方法及系统
CN109849930B (zh) 自动驾驶汽车的相邻车辆的速度计算方法和装置
CN111295667B (zh) 图像立体匹配的方法和辅助驾驶装置
CN112017236B (zh) 一种基于单目相机计算目标物位置的方法及装置
EP3845927B1 (en) Merging multiple lidar point cloud data using an iterative closest point (icp) algorithm with weighting factor
CN113781562B (zh) 一种基于道路模型的车道线虚实配准和自车定位方法
CN114120149B (zh) 一种倾斜摄影测量建筑物特征点提取方法、装置、电子设备及介质
CN112132754B (zh) 一种车辆移动轨迹修正方法及相关装置
US20230222688A1 (en) Mobile device positioning method and positioning apparatus
WO2024016524A1 (zh) 基于独立非均匀增量采样的网联车位置估计方法及装置
CN114241062A (zh) 自动驾驶的相机外参确定方法、装置和计算机可读存储介质
CN112198878A (zh) 一种即时地图构建方法、装置、机器人及存储介质
WO2022199195A1 (zh) 地图更新方法、系统、车载终端、服务器及存储介质
CN114648639B (zh) 一种目标车辆的检测方法、系统及装置
WO2023184869A1 (zh) 室内停车场的语义地图构建及定位方法和装置
CN110827337B (zh) 确定车载相机的姿态的方法、装置和电子设备
WO2023283929A1 (zh) 双目相机外参标定的方法及装置
CN115713560A (zh) 一种摄像头和车辆的外参数标定方法及装置、电子设备、存储介质
CN113034538B (zh) 一种视觉惯导设备的位姿跟踪方法、装置及视觉惯导设备
CN113126117B (zh) 一种确定sfm地图绝对尺度的方法及电子设备
CN112634141B (zh) 一种车牌矫正方法、装置、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21955435

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE