WO2019184885A1 - Camera extrinsic parameter calibration method, apparatus and electronic device - Google Patents

Camera extrinsic parameter calibration method, apparatus and electronic device

Info

Publication number
WO2019184885A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
coordinate system
coordinates
intersection points
pixel
Prior art date
Application number
PCT/CN2019/079569
Other languages
English (en)
French (fr)
Inventor
冉盛辉
Original Assignee
杭州海康威视数字技术股份有限公司
Priority date
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司
Publication of WO2019184885A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle

Definitions

  • The present application relates to the field of camera calibration technology, and in particular to a camera extrinsic parameter calibration method, apparatus, and electronic device.
  • The camera extrinsic parameters are a set of parameters that characterize the position and orientation of the camera in the world coordinate system, including the camera's roll angle, pitch angle, and yaw angle.
  • Camera extrinsic calibration is the process of obtaining these extrinsic parameters. In the field of visual measurement, extrinsic calibration is a critical step, and its accuracy and stability directly affect the accuracy of the measurement results.
  • Visual measurement technology has a wide range of application scenarios; for example, it can be applied to product quality inspection, vehicle monitoring, and other scenarios.
  • In the related art, for vehicle monitoring scenarios, the corresponding extrinsic calibration method is to extract lanes from the real-time image, apply a perspective-removal transformation to the extracted lanes, and then run an optimal-value iteration on the transformed result to obtain the camera's roll, pitch, and yaw angles.
  • Because this method requires iterative processing, the calculation process is unstable, the calculation takes a long time, the accuracy of the result is low, and the iteration may fail to converge, in which case the extrinsic parameters cannot be obtained at all.
  • The present application provides a camera extrinsic calibration method, apparatus, and electronic device, so as to improve both the speed of extrinsic calibration and the accuracy of the calibration result.
  • The specific technical solutions are as follows:
  • In a first aspect, an embodiment of the present application provides a camera extrinsic calibration method, which includes:
  • performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
  • acquiring, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
  • calibrating the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
  • In a second aspect, an embodiment of the present application provides a camera extrinsic calibration apparatus, which includes:
  • a distortion correction map acquisition module, configured to perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
  • a pixel coordinate acquisition module, configured to acquire, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
  • an extrinsic calibration module, configured to calibrate the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
  • In a third aspect, an embodiment of the present application provides an electronic device including a processor and a memory, where the memory is configured to store a computer program, and the processor, when executing the program stored in the memory, implements the steps of any of the above camera extrinsic calibration methods.
  • In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing instructions that, when run on a computer, cause the computer to execute any of the above camera extrinsic calibration methods.
  • It can be seen from the above technical solutions that distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset and the closed area enclosed by the four straight lines is a rectangle; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired, where these pixel coordinates are the coordinates of the four intersection points in the principal-point pixel coordinate system; and based on the pixel coordinates of the four intersection points, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy. Because the extrinsic parameters can be calibrated from these intersection points through the preset coordinate transformation strategy, no iterative processing is needed: the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.
  • FIG. 1 is a schematic flowchart of a camera extrinsic calibration method according to an embodiment of the present application;
  • FIG. 2 is a schematic view of four straight lines drawn in a scene that enclose a rectangular closed area;
  • FIG. 3 is a schematic diagram of the geometric meaning of the camera extrinsic parameters;
  • FIG. 4 is a schematic flowchart of a camera extrinsic calibration method according to another embodiment of the present application;
  • FIG. 5 is a schematic diagram of a designated vanishing point according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a designated vanishing point according to another embodiment of the present application;
  • FIG. 7 is a schematic diagram of a designated vanishing point according to still another embodiment of the present application;
  • FIG. 8 is a schematic diagram of the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines;
  • FIG. 9 is a schematic diagram of the relative position relationship between the camera local world coordinate system and the world coordinate system;
  • FIG. 10 is a schematic structural diagram of a camera extrinsic calibration apparatus according to an embodiment of the present application;
  • FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • To improve the accuracy of camera extrinsic calibration results, embodiments of the present application provide a camera extrinsic calibration method, apparatus, and electronic device.
  • The execution body of the camera extrinsic calibration method may be a camera containing a core processing chip, or a control device independent of the camera.
  • As shown in FIG. 1, a camera extrinsic calibration method provided by an embodiment of the present application may include the following steps:
  • Step 101: Perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map.
  • The original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle.
  • The four preset straight lines may be four lines drawn in the scene before extrinsic calibration whose enclosed closed area is a rectangle (as shown in FIG. 2, l1, l2, l3, and l4 are four such lines drawn in the scene, and A, B, C, and D are their intersections); they may also be four lines, selected from the lines on which the outline of an object in the scene lies, whose enclosed closed area is a rectangle. For example, when the scene contains a rectangular building, the closed area enclosed by the lines of its outline is a rectangle, so the lines of the building's four edges can be used as the four preset straight lines in this step.
  • After the scene containing the preset lines is photographed, the original calibration picture is obtained. Because of the shooting angle and the camera intrinsics (for example, deviations introduced during lens manufacturing and component assembly), the imaging process causes image distortion, so distortion appears in the original calibration picture; a straight line in the real scene may, for instance, become a curved line in the original calibration picture. Therefore, it is necessary to perform distortion correction on the original calibration picture according to the camera intrinsic parameters to obtain a distortion correction map.
  • Step 102: In the distortion correction map, acquire the pixel coordinates of the four intersection points formed by the four straight lines.
  • The pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system.
  • The principal-point pixel coordinate system in the embodiments of the present application is built on the basis of the pixel coordinate system. The pixel coordinate system lies on the imaging plane of the camera; its origin is at the upper-left corner of the picture and its unit of measurement is the pixel. For example, for a pixel in the picture whose coordinates are (x, y), the pixel is located in the x-th row and y-th column of the image.
  • The principal-point pixel coordinate system is also located on the imaging plane of the camera and its unit of measurement is also the pixel, but unlike the pixel coordinate system, its origin is at the camera's principal point, and its X-axis and Y-axis are respectively parallel to the X-axis and Y-axis of the pixel coordinate system.
  • Therefore, for a given point M in the distortion correction map, its coordinates in the pixel coordinate system and its coordinates in the principal-point pixel coordinate system satisfy:
  • x_m = x_M - cx,  y_m = cy - y_M,
  • where (cx, cy) are the coordinates of the camera's principal point in the pixel coordinate system, (x_m, y_m) are the coordinates of the point M in the principal-point pixel coordinate system (that is, the pixel coordinates of M), and (x_M, y_M) are the coordinates of the point M in the pixel coordinate system.
  • The pixel coordinates of the four intersection points may be extracted directly from the distortion correction map, or other features may first be extracted from the distortion correction map and the pixel coordinates of the four intersection points then obtained indirectly by coordinate calculation.
  • In one example, the step of acquiring the pixel coordinates of the four intersection points formed by the four straight lines may include:
  • obtaining, in the distortion correction map and by straight-line fitting, the line equations of the four straight lines in the principal-point pixel coordinate system; and calculating the pixel coordinates of the four intersection points formed by the four straight lines according to these line equations.
  • Because of limitations such as image resolution, directly extracting the pixel coordinates of the four intersection points from the distortion correction map can give inaccurate results. Therefore, the line equations of the four straight lines in the principal-point pixel coordinate system can first be obtained by straight-line fitting (line equations obtained by fitting are more accurate than line equations determined from only two points on each line), and the coordinates of the four intersection points are then obtained by coordinate calculation. The pixel coordinates of the four intersection points obtained in this way are unique and highly accurate.
  • Step 103: Calibrate the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
  • The camera extrinsic parameters may include the camera roll angle, the camera pitch angle, and the camera yaw angle, and may also include the camera's mounting height and the camera's position. As shown in FIG. 3, C is the camera's principal point, CO is the camera's optical axis, O-XYZ is the camera local world coordinate system whose origin is the intersection of the camera optical axis with the ground, t is the pitch angle, p is the yaw angle, s is the roll angle, and h is the camera mounting height.
  • Camera extrinsic calibration involves the camera local world coordinate system, the pixel coordinate system, the principal-point pixel coordinate system, and the world coordinate system. The pixel coordinates of the four intersection points are quantities in the principal-point pixel coordinate system, while the camera roll angle, pitch angle, yaw angle, mounting height, and position are quantities in the camera local world coordinate system or the world coordinate system. The preset coordinate transformation strategy can therefore be understood as the mapping relationships between these coordinate systems; through these mappings the above extrinsic parameters can be obtained and the extrinsic calibration is completed.
  • When extrinsic calibration is performed with the method provided by this embodiment, the camera roll angle, pitch angle, yaw angle, mounting height, and position can all be calculated, or only a subset of these extrinsic parameters can be calculated selectively, according to requirements or the needs of the application.
  • In the camera extrinsic calibration method shown in FIG. 1, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired; and based on these pixel coordinates, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy. Since the pixel coordinates of the four intersections of the four straight lines enclosing a rectangular closed area in the scene are obtained in the distortion correction map, the extrinsic parameters can be calibrated by the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the accuracy of the extrinsic calibration result is improved.
  • Below, taking as an example extrinsic parameters that include the camera roll angle, camera pitch angle, camera yaw angle, camera mounting height, and camera position, a camera extrinsic calibration method provided by an embodiment of the present application is introduced. As shown in FIG. 4, it may include the following steps:
  • Step 201: Perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map.
  • Deviations exist in the camera's lens manufacturing and component assembly, so the imaging process causes image distortion, i.e. distortion of the original image; for example, an originally straight line becomes a curved line in the picture after imaging. Therefore, the captured picture must be distortion-corrected using the camera intrinsic parameters.
  • The camera intrinsic parameters include the camera focal lengths, the coordinates of the camera's principal point in the pixel coordinate system, and the distortion parameters.
  • The distortion correction process is: for a given pixel, find the correspondence between that point's position in the picture before and after distortion, and re-assign the pixel value of that point in the distorted picture to the corresponding position in the undistorted (ideal) picture. The specific process is as follows:
  • First, the coordinates (u, v) of each point of the imaging plane in the pixel coordinate system under ideal (distortion-free) conditions are calculated from (X_C, Y_C, Z_C). The coordinates of each point after distortion are:
  • u_d = f_x · x" + c_x,  v_d = f_y · y" + c_y,
  • where x" and y" are the intermediate quantities produced by the distortion model involving the distortion parameters k_1, k_2, k_3, p_1, and p_2.
  • Assigning the pixel value at (u_d, v_d) to (u, v) gives the pixel value of each point in the distortion correction map:
  • f(u, v) = f(u_d, v_d),
  • where f_x and f_y are the camera focal lengths, (c_x, c_y) are the coordinates of the camera's principal point in the pixel coordinate system, k_1, k_2, k_3, p_1, and p_2 are the distortion parameters, and (X_C, Y_C, Z_C) are the coordinates of each point in the camera coordinate system; the Z-axis of the camera coordinate system is the camera optical axis, and its X-axis and Y-axis are respectively parallel to the X-axis and Y-axis of the principal-point pixel coordinate system.
  • Step 202 In the distortion correction map, acquire pixel coordinates of four intersection points formed by four straight lines.
  • the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal point pixel coordinate system.
  • Steps 201 and 202 respectively correspond to the specific content in step 101 and step 102, and details are not described herein again.
  • Step 203: Based on the pixel coordinates of the four intersection points, calculate the camera roll angle using the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system.
  • In this step, a camera roll angle calculation formula can be derived from the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system, in terms of the intermediate quantities:
  • α_AB = x_B - x_A,  β_AB = y_B - y_A,  χ_AB = x_A·y_B - x_B·y_A,
  • α_AC = x_C - x_A,  β_AC = y_C - y_A,  χ_AC = x_A·y_C - x_C·y_A,
  • α_BD = x_D - x_B,  β_BD = y_D - y_B,  χ_BD = x_B·y_D - x_D·y_B,
  • α_CD = x_D - x_C,  β_CD = y_D - y_C,  χ_CD = x_C·y_D - x_D·y_C,
  • where (x_A, y_A), (x_B, y_B), (x_C, y_C), and (x_D, y_D) are the pixel coordinates of the four intersection points, and s is the camera roll angle.
  • Step 204: Perform a correction calculation on the camera roll angle, and calculate the corrected pixel coordinates of the four intersection points.
  • The corrected pixel coordinates of the four intersection points are the pixel coordinates that the four intersection points would have on a distortion correction map with no camera roll angle.
  • The correction calculation can be performed by setting the camera roll angle to 0 and transforming the pixel coordinates of the four intersection points accordingly; the corrected pixel coordinates are the coordinates obtained after this transformation, i.e. the pixel coordinates of the four intersection points after the camera roll angle has been corrected to 0.
  • In one example, performing the correction calculation on the camera roll angle and calculating the corrected pixel coordinates of the four intersection points includes: calculating, according to the camera roll angle, the corrected pixel coordinates of the four intersection points with the correction formulas:
  • x_A' = cos s·x_A + sin s·y_A,  y_A' = -sin s·x_A + cos s·y_A,
  • x_B' = cos s·x_B + sin s·y_B,  y_B' = -sin s·x_B + cos s·y_B,
  • x_C' = cos s·x_C + sin s·y_C,  y_C' = -sin s·x_C + cos s·y_C,
  • x_D' = cos s·x_D + sin s·y_D,  y_D' = -sin s·x_D + cos s·y_D,
  • where (x_A, y_A), (x_B, y_B), (x_C, y_C), and (x_D, y_D) are the pixel coordinates of the four intersection points, (x_A', y_A'), (x_B', y_B'), (x_C', y_C'), and (x_D', y_D') are their corrected pixel coordinates, and s is the camera roll angle.
  • Step 205: Determine the pixel coordinates of the designated vanishing point according to the corrected pixel coordinates of the four intersection points.
  • The designated vanishing point is the intersection formed by any set of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion correction map with no camera roll angle; that is, with the camera roll angle equal to 0, it is an intersection formed by any set of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion correction map.
  • Step 206: Extract the longitudinal focal length of the camera from the camera intrinsic parameters, and extract the ordinate of the designated vanishing point from its pixel coordinates.
  • Step 207: Calculate the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  • Step 208: Extract the lateral focal length of the camera from the camera intrinsic parameters, and extract the abscissa of the designated vanishing point from its pixel coordinates.
  • Step 209: Calculate the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point, and the camera pitch angle.
  • Regarding the vanishing point in step 205, four cases exist:
  • (1) In the quadrilateral formed by the four intersection points, only the pair of opposite sides in the y-axis direction is non-parallel, i.e. a vanishing point exists only in the y-axis direction. As shown in FIG. 5, point (P), the intersection of lines A'C' and B'D', is the designated vanishing point, with pixel coordinates (u_0, v_0). In this case, in step 207 the camera pitch angle is calculated by the pitch angle calculation formula, where t is the camera pitch angle, f_y is the camera's longitudinal focal length, and v_0 is the ordinate of the designated vanishing point (P); in step 209 the camera yaw angle is calculated by the yaw angle calculation formula, where p is the camera yaw angle, t is the camera pitch angle, f_x is the camera's lateral focal length, and u_0 is the abscissa of the designated vanishing point (P).
  • (2) In the quadrilateral formed by the four intersection points, neither the pair of opposite sides in the x-axis direction nor the pair in the y-axis direction is parallel, i.e. there are two vanishing points, one in each direction. In this case, the camera pitch angle and yaw angle can be calculated from the coordinates of either vanishing point: when using the vanishing point in the y-axis direction, the pitch and yaw angle calculation formulas above can be used directly; when using the vanishing point in the x-axis direction, analogous calculation formulas can be derived from the vanishing-point rules and the relative positions of the coordinate systems, which are not repeated here.
  • (3) In the quadrilateral formed by the four intersection points, only the pair of opposite sides in the x-axis direction is non-parallel, i.e. a vanishing point exists only in the x-axis direction. As shown in FIG. 6, the designated vanishing point is point (P'), with abscissa u_1 and ordinate v_1. In this case the camera pitch angle is 0, and the camera yaw angle is calculated by a formula in which p is the camera yaw angle, f_x is the camera's lateral focal length, and u_1 is the abscissa of point (P').
  • (4) In the quadrilateral formed by the four intersection points, no pair of opposite sides is non-parallel, i.e. no vanishing point exists, as shown in FIG. 7. In this case, both the camera pitch angle and the camera yaw angle are 0.
  • Step 210: Acquire the distance, in the scene, between the two specified straight lines among the four straight lines.
  • The two specified straight lines are the two lines for which the designated vanishing point exists in the distortion correction map.
  • Step 211: Based on the corrected pixel coordinates of the four intersection points, calculate the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines.
  • Step 212: Calculate the camera mounting height based on the distance between the two specified straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept, and the camera pitch angle.
  • In this step, the camera mounting height can be calculated through a trigonometric relationship in which f_x is the lateral focal length of the camera, w is the distance between the two specified straight lines among the four lines, h is the camera mounting height, t is the camera pitch angle, p is the camera yaw angle, and δ is the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines. As shown in FIG. 8, C is the camera principal point, x is the horizontal axis of the principal-point pixel coordinate system, and the lines A'B', A'D', A'C', and B'D' are the four straight lines determined by the corrected pixel coordinates of the four intersections of the four straight lines preset in the scene.
  • Step 213: Acquire the coordinates, in the world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene.
  • The first intersection point is any one of the four intersection points. Since the camera can be placed anywhere in the environment, a reference coordinate system is selected in the environment to describe the position of the camera and of any object in the environment; this coordinate system is the world coordinate system.
  • Step 214: Calculate the coordinates of the camera in the camera local world coordinate system, based on the camera mounting height, the camera pitch angle, and the camera yaw angle.
  • The origin of the camera local world coordinate system is the intersection of the camera's optical axis with the ground, and its X-, Y-, and Z-axes are respectively parallel to the X-, Y-, and Z-axes of the world coordinate system, as shown in FIG. 9, where O-XYZ is the camera local world coordinate system and O_w-X_wY_wZ_w is the world coordinate system.
  • In this step, the coordinates of the camera in the camera local world coordinate system can be calculated with the trigonometric relationships:
  • xcam = h·cos p·cot t,  ycam = h·sin p·cot t,  zcam = h,
  • where (xcam, ycam, zcam) are the coordinates of the camera in the camera local world coordinate system, h is the camera mounting height, t is the camera pitch angle, and p is the camera yaw angle.
  • Step 215: Calculate the coordinates of the first intersection point in the camera local world coordinate system, based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera local world coordinate system, and the coordinates of the camera's principal point in the pixel coordinate system.
  • In this step, the coordinate transformation relationship between the camera local world coordinate system and the pixel coordinate system of the distortion correction map can be used to calculate the coordinates of the first intersection point in the camera local world coordinate system, where (u, v) are the pixel coordinates of the first intersection point in the pixel coordinate system of the distortion correction map, (x_w, y_w, 0) are the coordinates of the first intersection point in the camera local world coordinate system, λ is a scale factor characterizing the scaling relationship of the mapping between the pixel coordinate system and the camera local world coordinate system, p is the camera yaw angle, t is the camera pitch angle, s is the camera roll angle, (cx, cy) are the pixel coordinates of the camera's principal point in the pixel coordinate system of the distortion correction map, f_x is the camera's lateral focal length, and f_y is the camera's longitudinal focal length.
  • Step 216: Calculate the coordinates of the origin of the camera local world coordinate system in the world coordinate system, according to the coordinates of the first intersection point in the camera local world coordinate system and the coordinates of the first intersection point in the world coordinate system.
  • In this step, the coordinates of the origin of the camera local world coordinate system in the world coordinate system can be calculated with the coordinate transformation formulas:
  • x_0 = X_aw - x_w,  y_0 = Y_aw - y_w,  z_0 = 0,
  • where (x_w, y_w, 0) are the coordinates of the first intersection point in the camera local world coordinate system, (x_0, y_0, z_0) are the coordinates of the origin of the camera local world coordinate system in the world coordinate system, and (X_aw, Y_aw, 0) are the coordinates of the first intersection point in the world coordinate system.
  • Step 217: Calculate the coordinates of the camera in the world coordinate system, based on the coordinates of the origin of the camera local world coordinate system in the world coordinate system and the coordinates of the camera in the camera local world coordinate system:
  • X_cam = xcam + x_0,  Y_cam = ycam + y_0,  Z_cam = zcam,
  • where (x_0, y_0, z_0) are the coordinates of the origin of the camera local world coordinate system in the world coordinate system, (xcam, ycam, zcam) are the coordinates of the camera in the camera local world coordinate system, and (X_cam, Y_cam, Z_cam) are the coordinates of the camera in the world coordinate system, i.e. the camera's real installation position.
  • According to actual needs, only part of the camera extrinsic parameters among the camera roll angle, camera pitch angle, camera yaw angle, camera mounting height, and camera position may be calculated following the corresponding steps in FIG. 4; details are not described again here.
  • In the camera extrinsic calibration method shown in FIG. 4, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired; and based on these pixel coordinates, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy. Since the pixel coordinates of the four intersections of the four straight lines enclosing a rectangular closed area in the scene are obtained in the distortion correction map, the extrinsic parameters can be calibrated by the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.
  • Based on the same inventive concept, and corresponding to the camera extrinsic calibration method provided by the above embodiments of the present application, an embodiment of the present application provides a camera extrinsic calibration apparatus, whose schematic structure is shown in FIG. 10, including:
  • a distortion correction map acquisition module 301, configured to perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
  • a pixel coordinate acquisition module 302, configured to acquire, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
  • an extrinsic calibration module 303, configured to calibrate the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
  • In one example, the pixel coordinate acquisition module 302 is specifically configured to: obtain, in the distortion correction map and by straight-line fitting, the line equations of the four straight lines in the principal-point pixel coordinate system; and calculate the pixel coordinates of the four intersection points formed by the four straight lines according to these line equations.
  • In one example, the camera extrinsic parameters include the camera roll angle, and the extrinsic calibration module 303 is specifically configured to: calculate the camera roll angle based on the pixel coordinates of the four intersection points, using the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system.
  • In one example, the extrinsic calibration module 303 is further configured to: perform a correction calculation on the camera roll angle and calculate the corrected pixel coordinates of the four intersection points, where the corrected pixel coordinates of the four intersection points are the pixel coordinates of the four intersection points on the distortion correction map with no camera roll angle; determine the pixel coordinates of the designated vanishing point according to the corrected pixel coordinates of the four intersection points, where the designated vanishing point is the intersection formed by any set of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion correction map with no camera roll angle; extract the longitudinal focal length of the camera from the camera intrinsic parameters and extract the ordinate of the designated vanishing point from its pixel coordinates; and calculate the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  • In one example, the extrinsic calibration module 303 is specifically configured to: calculate, according to the camera roll angle, the corrected pixel coordinates of the four intersection points using the correction formulas given above for step 204.
  • In one example, the extrinsic calibration module 303 is further configured to: extract the lateral focal length of the camera from the camera intrinsic parameters, and extract the abscissa of the designated vanishing point from its pixel coordinates; and calculate the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point, and the camera pitch angle.
  • In one example, the extrinsic calibration module 303 is further configured to: acquire the distance, in the scene, between the two specified straight lines among the four straight lines, where the two specified straight lines are the two lines for which the designated vanishing point exists in the distortion correction map; calculate, based on the corrected pixel coordinates of the four intersection points, the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines; and calculate the camera mounting height based on the distance between the two specified straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept, and the camera pitch angle.
  • In one example, the extrinsic calibration module 303 is further configured to: acquire the coordinates, in the world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines, where the first intersection point is any one of the four intersection points; calculate the coordinates of the camera in the camera local world coordinate system based on the camera mounting height, the camera pitch angle, and the camera yaw angle; calculate the coordinates of the first intersection point in the camera local world coordinate system based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera local world coordinate system, and the coordinates of the camera's principal point in the pixel coordinate system; calculate the coordinates of the origin of the camera local world coordinate system in the world coordinate system based on the coordinates of the first intersection point in the camera local world coordinate system and the coordinates of the first intersection point in the world coordinate system; and calculate the coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera local world coordinate system in the world coordinate system and the coordinates of the camera in the camera local world coordinate system.
  • In the camera extrinsic calibration apparatus provided by the embodiments of the invention, the distortion correction map acquisition module 301 performs distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map; the pixel coordinate acquisition module 302 acquires, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines; and the extrinsic calibration module 303 calibrates the camera extrinsic parameters through a preset coordinate transformation strategy based on these pixel coordinates. Since the pixel coordinates of the four intersections of the four straight lines enclosing a rectangular closed area in the scene are obtained in the distortion correction map, the extrinsic parameters can be calibrated without iterative processing; the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.
  • the embodiment of the present application further provides an electronic device, as shown in FIG. 11, including a processor 401 and a memory 402, wherein
  • the memory 402 is configured to store a computer program.
  • the processor 401 is configured to implement the camera external reference calibration method provided by the embodiment of the present application when the program stored in the memory 402 is executed.
  • When the processor 401 executes the program stored in the memory 402, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
  • in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
  • based on the pixel coordinates of the four intersection points, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy.
  • In one example, the step of acquiring the pixel coordinates of the four intersection points formed by the four straight lines may include:
  • obtaining, in the distortion correction map and by straight-line fitting, the line equations of the four straight lines in the principal-point pixel coordinate system; and
  • calculating the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
  • In one example, the camera extrinsic parameters include the camera roll angle.
  • The step of calibrating the camera extrinsic parameters through a preset coordinate transformation strategy may include: calculating the camera roll angle based on the pixel coordinates of the four intersection points, using the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system.
  • After the step of calculating the camera roll angle based on the pixel coordinates of the four intersection points using the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system, the method may further include:
  • performing a correction calculation on the camera roll angle and calculating the corrected pixel coordinates of the four intersection points;
  • the corrected pixel coordinates of the four intersection points being the pixel coordinates of the four intersection points on the distortion correction map with no camera roll angle;
  • determining the pixel coordinates of the designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being the intersection formed by any set of non-parallel opposite sides of the quadrilateral composed of the four intersection points on the distortion correction map with no camera roll angle;
  • extracting the longitudinal focal length of the camera from the camera intrinsic parameters, and extracting the ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point; and
  • calculating the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  • In one example, the step of performing the correction calculation on the camera roll angle and calculating the corrected pixel coordinates of the four intersection points may include: calculating, according to the camera roll angle, the corrected pixel coordinates of the four intersection points with the correction formulas given above for step 204.
  • After the step of calculating the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point, the method may further include:
  • extracting the lateral focal length of the camera from the camera intrinsic parameters, and extracting the abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point; and
  • calculating the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point, and the camera pitch angle.
  • In one example, the method may further include:
  • acquiring the distance, in the scene, between two specified straight lines among the four straight lines, where the two specified straight lines are the two lines for which the designated vanishing point exists in the distortion correction map;
  • calculating, based on the corrected pixel coordinates of the four intersection points, the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines; and
  • calculating the camera mounting height based on the distance between the two specified straight lines among the four lines, the lateral focal length of the camera, the camera yaw angle, the intercept, and the camera pitch angle.
  • In one example, the method may further include:
  • acquiring the coordinates, in the world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines, the first intersection point being any one of the four intersection points;
  • calculating the coordinates of the camera in the camera local world coordinate system based on the camera mounting height, the camera pitch angle, and the camera yaw angle;
  • calculating the coordinates of the first intersection point in the camera local world coordinate system based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera local world coordinate system, and the coordinates of the camera's principal point in the pixel coordinate system;
  • calculating the coordinates of the origin of the camera local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera local world coordinate system and the coordinates of the first intersection point in the world coordinate system; and
  • calculating the coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera local world coordinate system in the world coordinate system and the coordinates of the camera in the camera local world coordinate system.
  • The memory may include a random access memory (RAM), and may also include a non-volatile memory (NVM), for example at least one disk storage. Optionally, the memory may also be at least one storage device located away from the aforementioned processor.
  • The above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • The processor 401 and the memory 402 may be connected by a communication bus, such as an address bus, a data bus, or a control bus; the communication bus may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, among others. Communication between the processor 401 and the memory 402, and between the electronic device and other external devices, may also be performed through a wireless connection via a wireless module.
  • When the electronic device provided by this embodiment executes the above method, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired, where these pixel coordinates are the coordinates of the four intersection points in the principal-point pixel coordinate system; and based on the pixel coordinates of the four intersection points, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy.
  • The pixel coordinates of the four intersections formed by the four straight lines enclosing a rectangular closed area in the scene are thus obtained in the distortion correction map, and the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.
  • An embodiment of the present application further provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the camera extrinsic calibration method provided by any of the above embodiments.
  • When these instructions are executed, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired, where these pixel coordinates are the coordinates of the four intersection points in the principal-point pixel coordinate system; and based on the pixel coordinates of the four intersection points, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy.
  • The pixel coordinates of the four intersections formed by the four straight lines enclosing a rectangular closed area in the scene are thus obtained in the distortion correction map, and the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.

Abstract

A camera extrinsic parameter calibration method, apparatus, and electronic device, including: performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map; acquiring, in the distortion correction map, the pixel coordinates of the four intersection points formed by four straight lines; and calibrating the camera extrinsic parameters through a preset coordinate transformation strategy based on the pixel coordinates of the four intersection points. This solution can improve the speed of camera extrinsic calibration and the accuracy of the calibration result.

Description

Camera extrinsic parameter calibration method, apparatus, and electronic device
This application claims priority to Chinese Patent Application No. 201810298452.0, filed with the Chinese Patent Office on March 30, 2018 and entitled "Camera extrinsic parameter calibration method, apparatus, and electronic device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of camera calibration technology, and in particular to a camera extrinsic parameter calibration method, apparatus, and electronic device.
Background
The camera extrinsic parameters are a set of parameters that characterize properties such as the position and orientation of the camera in the world coordinate system, including the camera's roll angle, pitch angle, and yaw angle; camera extrinsic calibration is the process of obtaining these extrinsic parameters. In the field of visual measurement, extrinsic calibration is a critical step, and its accuracy and stability directly affect the accuracy of the measurement results.
Visual measurement technology has a wide range of application scenarios; for example, it can be applied to product quality inspection, vehicle monitoring, and other scenarios. In the related art, for vehicle monitoring scenarios, the corresponding extrinsic calibration method is: extract lanes from the real-time image, apply a perspective-removal transformation to the extracted lanes, and then run an optimal-value iteration on the transformed result to obtain the camera's roll, pitch, and yaw angles.
When the above method is used for camera extrinsic calibration, iterative processing is required; as a result, the calculation process is unstable, the calculation takes a long time, the accuracy of the result is low, and the iteration may fail to converge, in which case the camera extrinsic parameters cannot be obtained.
Summary
The present application provides a camera extrinsic parameter calibration method, apparatus, and electronic device, so as to improve the speed of extrinsic calibration and the accuracy of the calibration result. The specific technical solutions are as follows:
In a first aspect, an embodiment of the present application provides a camera extrinsic calibration method, the method including:
performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
acquiring, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
calibrating the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
In a second aspect, an embodiment of the present application provides a camera extrinsic calibration apparatus, the apparatus including:
a distortion correction map acquisition module, configured to perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle;
a pixel coordinate acquisition module, configured to acquire, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines, where the pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system; and
an extrinsic calibration module, configured to calibrate the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory;
the memory is configured to store a computer program; and
the processor is configured to implement the steps of any of the above camera extrinsic calibration methods when executing the program stored in the memory.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing instructions that, when run on a computer, cause the computer to execute any of the above camera extrinsic calibration methods.
It can be seen from the above technical solutions that distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map, where the original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset and the closed area enclosed by the four straight lines in the scene is a rectangle; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired, where these pixel coordinates are the coordinates of the four intersection points in the principal-point pixel coordinate system; and based on the pixel coordinates of the four intersection points, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy. Because the pixel coordinates of the four intersections of the four straight lines enclosing a rectangular closed area in the photographed scene are obtained in the distortion correction map, the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the speed of extrinsic calibration and the accuracy of the calibration result are improved.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a camera extrinsic calibration method according to an embodiment of the present application;
FIG. 2 is a schematic view of four straight lines drawn in a scene that enclose a rectangular closed area;
FIG. 3 is a schematic diagram of the geometric meaning of the camera extrinsic parameters;
FIG. 4 is a schematic flowchart of a camera extrinsic calibration method according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a designated vanishing point according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a designated vanishing point according to another embodiment of the present application;
FIG. 7 is a schematic diagram of a designated vanishing point according to still another embodiment of the present application;
FIG. 8 is a schematic diagram of the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines;
FIG. 9 is a schematic diagram of the relative position relationship between the camera local world coordinate system and the world coordinate system;
FIG. 10 is a schematic structural diagram of a camera extrinsic calibration apparatus according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To improve the accuracy of camera extrinsic calibration results, embodiments of the present application provide a camera extrinsic parameter calibration method, apparatus, and electronic device.
A camera extrinsic calibration method provided by an embodiment of the present application is introduced below.
The execution body of the camera extrinsic calibration method may be a camera containing a core processing chip, or a control device independent of the camera.
As shown in FIG. 1, a camera extrinsic calibration method provided by an embodiment of the present application may specifically include the following steps:
Step 101: Perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map.
The original calibration picture is obtained by photographing, with the camera, a scene in which four straight lines are preset, and the closed area enclosed by the four straight lines in the scene is a rectangle.
The four preset straight lines in this step may be four lines drawn in the scene before extrinsic calibration whose enclosed closed area is a rectangle (as shown in FIG. 2, l1, l2, l3, and l4 are four such lines drawn in the scene, and A, B, C, and D are the intersections of these four lines); they may also be four lines, selected from the lines on which the outline of an object in the scene lies, whose enclosed closed area is a rectangle. For example, when the scene contains a rectangular building, the closed area enclosed by the lines of its outline is a rectangle, so the lines on which the building's four edges lie can be used as the four preset straight lines in this step.
After the scene containing the preset lines is photographed, the original calibration picture is obtained. Because of the shooting angle and the camera intrinsics (for example, deviations introduced during lens manufacturing and the assembly of the camera's components), the imaging process causes image distortion, i.e. distortion of the image; distortion therefore appears in the original calibration picture, and a straight line in the real scene may, for instance, become a curved line in the original calibration picture. Therefore, it is necessary to perform distortion correction on the original calibration picture according to the camera intrinsic parameters to obtain a distortion correction map.
Step 102: In the distortion correction map, acquire the pixel coordinates of the four intersection points formed by the four straight lines.
The pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system.
The principal-point pixel coordinate system in the embodiments of the present application is built on the basis of the pixel coordinate system; before introducing the principal-point pixel coordinate system, the pixel coordinate system is first defined. The pixel coordinate system lies on the imaging plane of the camera; its origin is at the upper-left corner of the picture and its unit of measurement is the pixel. For example, for a pixel in the picture whose coordinates are (x, y), the pixel is located in the x-th row and y-th column of the image.
The principal-point pixel coordinate system in this step is also located on the imaging plane of the camera and its unit of measurement is also the pixel; unlike the pixel coordinate system, however, its origin is at the camera's principal point, and its X-axis and Y-axis are respectively parallel to the X-axis and Y-axis of the pixel coordinate system.
Therefore, for a given point M in the distortion correction map, its coordinates in the pixel coordinate system and its coordinates in the principal-point pixel coordinate system satisfy the following correspondence:
x_m = x_M - cx
y_m = cy - y_M
where (cx, cy) are the coordinates of the camera's principal point in the pixel coordinate system; (x_m, y_m) are the coordinates of the point M in the principal-point pixel coordinate system, i.e. the pixel coordinates of the point M; and (x_M, y_M) are the coordinates of the point M in the pixel coordinate system.
The pixel coordinates of the four intersection points formed by the four straight lines may be acquired by extracting them directly from the distortion correction map, or by first extracting other features from the distortion correction map and then obtaining the pixel coordinates of the four intersection points indirectly through coordinate calculation.
In one example, the step of acquiring, in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines may include:
in the distortion correction map, obtaining, by straight-line fitting, the line equations of the four straight lines in the principal-point pixel coordinate system; and calculating the pixel coordinates of the four intersection points formed by the four straight lines according to these line equations.
Because of limitations such as image resolution, directly extracting the pixel coordinates of the four intersection points from the distortion correction map can lead to inaccurate results. Therefore, the line equations of the four straight lines in the principal-point pixel coordinate system can first be obtained by straight-line fitting (the line equations obtained by fitting are more accurate than line equations determined from two points on each line), and the coordinates of the four intersection points are then obtained by coordinate calculation; the pixel coordinates of the four intersection points obtained in this way are unique and highly accurate.
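The line fitting and intersection calculation described above can be sketched in code. The following Python fragment is only an illustrative sketch, not the patent's implementation: it assumes each of the four lines is available as a set of sampled pixels (for example from edge detection), fits a line a·x + b·y + c = 0 to each set by total least squares, converts points into the principal-point pixel coordinate system with x_m = x_M - cx, y_m = cy - y_M, and intersects pairs of fitted lines by solving a 2x2 linear system. The variable names (samples_l1 and so on) are hypothetical.

```python
import numpy as np

def to_principal_point_coords(pts_xy, cx, cy):
    """Convert pixel coordinates (x_M, y_M) to principal-point pixel
    coordinates (x_m, y_m) = (x_M - cx, cy - y_M), as in the description."""
    pts = np.asarray(pts_xy, dtype=float)
    return np.column_stack((pts[:, 0] - cx, cy - pts[:, 1]))

def fit_line(points):
    """Total-least-squares fit of a line a*x + b*y + c = 0 (a^2 + b^2 = 1)
    to 2D points; returns (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The direction of smallest variance of the centered points is the line normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def intersect(l1, l2):
    """Intersection of two lines given as (a, b, c) with a*x + b*y + c = 0."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    A = np.array([[a1, b1], [a2, b2]])
    rhs = -np.array([c1, c2])
    return np.linalg.solve(A, rhs)      # raises LinAlgError if the lines are parallel

# Hypothetical usage: samples_l1 .. samples_l4 are pixel samples of the four
# preset lines in the distortion correction map, (cx, cy) is the principal point.
# lines = [fit_line(to_principal_point_coords(s, cx, cy))
#          for s in (samples_l1, samples_l2, samples_l3, samples_l4)]
# A = intersect(lines[0], lines[2])  # the pairing depends on which lines actually cross
```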
Step 103: Calibrate the camera extrinsic parameters through a preset coordinate transformation strategy, based on the pixel coordinates of the four intersection points.
The camera extrinsic parameters may include the camera roll angle, the camera pitch angle, and the camera yaw angle, and may also include the camera's mounting height and the camera's position. As shown in FIG. 3, C is the camera's principal point, CO is the camera's optical axis, the coordinate system O-XYZ is the camera local world coordinate system whose origin is the intersection of the camera optical axis with the ground, t is the pitch angle, p is the yaw angle, s is the roll angle, and h is the camera mounting height.
Camera extrinsic calibration involves the camera local world coordinate system, the pixel coordinate system, the principal-point pixel coordinate system, and the world coordinate system. The pixel coordinates of the four intersection points are quantities in the principal-point pixel coordinate system, while the camera roll angle, camera pitch angle, camera yaw angle, camera mounting height, and camera position are quantities in the camera local world coordinate system or the world coordinate system. The preset coordinate transformation strategy can therefore be understood as the mapping relationships between these coordinate systems; through these mappings, the above extrinsic parameters can be obtained and the camera extrinsic calibration is completed.
When extrinsic calibration is performed with the method provided by this embodiment, the camera roll angle, camera pitch angle, camera yaw angle, camera mounting height, and camera position can all be calculated, or only a subset of these extrinsic parameters can be calculated selectively according to requirements or the needs of the application.
In the camera extrinsic calibration method shown in FIG. 1 provided by the embodiment of the present application, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map; in the distortion correction map, the pixel coordinates of the four intersection points formed by the four straight lines are acquired; and based on these pixel coordinates, the camera extrinsic parameters are calibrated through a preset coordinate transformation strategy. Since the pixel coordinates of the four intersections of the four straight lines enclosing a rectangular closed area in the scene are obtained in the distortion correction map, the extrinsic parameters can be calibrated by the preset coordinate transformation strategy without iterative processing; the calculation process is stable, the calculation time is short, and the accuracy of the extrinsic calibration result is improved.
Below, taking as an example camera extrinsic parameters that include the camera roll angle, camera pitch angle, camera yaw angle, camera mounting height, and camera position, a camera extrinsic calibration method provided by an embodiment of the present application is introduced. As shown in FIG. 4, it may specifically include the following steps:
Step 201: Perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion correction map.
Deviations exist in the camera's lens manufacturing and in the assembly of its components, so the imaging process causes image distortion, resulting in distortion of the original image; for example, an originally straight line becomes a curved line in the picture after being imaged by the camera. Therefore, the captured picture must be distortion-corrected using the camera intrinsic parameters.
The camera intrinsic parameters include: the camera focal lengths, the coordinates of the camera's principal point in the pixel coordinate system, and the distortion parameters.
The distortion correction process is: for a given pixel, find the correspondence between that point's position in the picture before and after distortion, and re-assign the pixel value of that point in the distorted picture to the corresponding position in the pre-distortion picture, i.e. the picture under ideal conditions in which no distortion exists. The specific process is as follows:
First, the coordinates (u, v) of each point of the imaging plane in the pixel coordinate system under ideal conditions are calculated by the projection formula (given in the published application as image PCTCN2019079569-appb-000001).
The coordinates of each point after distortion are (u_d, v_d):
u_d = f_x · x" + c_x,  v_d = f_y · y" + c_y
where x" and y" are given by the distortion model (image PCTCN2019079569-appb-000002).
Assigning the pixel value corresponding to (u_d, v_d) to (u, v) gives the pixel value of each point in the distortion correction map:
f(u, v) = f(u_d, v_d)
where f_x and f_y are the camera focal lengths, (c_x, c_y) are the coordinates of the camera's principal point in the pixel coordinate system, k_1, k_2, k_3, p_1, and p_2 are the distortion parameters, and (X_C, Y_C, Z_C) are the coordinates of each point in the camera coordinate system; the Z-axis of the camera coordinate system is the camera optical axis, and its X-axis and Y-axis are respectively parallel to the X-axis and Y-axis of the principal-point pixel coordinate system.
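As an illustration of this step, the sketch below uses OpenCV's standard radial and tangential distortion model, which takes the same parameter names (f_x, f_y, c_x, c_y, k_1, k_2, p_1, p_2, k_3) as the description; whether it matches the patent's exact correction formulas (the formula images above are not reproduced) is an assumption.

```python
import cv2
import numpy as np

def undistort_calibration_picture(original_img, fx, fy, cx, cy, k1, k2, p1, p2, k3):
    """Distortion-correct the original calibration picture using the camera
    intrinsics; a minimal sketch based on OpenCV's distortion model, which is
    assumed here to match the model used in the description."""
    K = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])
    dist = np.array([k1, k2, p1, p2, k3])
    # cv2.undistort resamples the image so that straight lines in the scene
    # become straight again, yielding the "distortion correction map".
    return cv2.undistort(original_img, K, dist)
```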
Step 202: In the distortion correction map, acquire the pixel coordinates of the four intersection points formed by the four straight lines.
The pixel coordinates of the four intersection points are the coordinates of the four intersection points in the principal-point pixel coordinate system.
Steps 201 and 202 correspond to the specific content of steps 101 and 102, respectively, and are not described again here.
Step 203: Based on the pixel coordinates of the four intersection points, calculate the camera roll angle using the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system.
In this step, the camera roll angle calculation formula (image PCTCN2019079569-appb-000003) can be derived from the equal-opposite-sides property of the rectangular region and the preset mapping relationship between the principal-point pixel coordinate system and the camera local world coordinate system, in terms of the intermediate quantities:
α_AB = x_B - x_A,  β_AB = y_B - y_A,  χ_AB = x_A·y_B - x_B·y_A,  α_AC = x_C - x_A,
β_AC = y_C - y_A,  χ_AC = x_A·y_C - x_C·y_A,  α_BD = x_D - x_B,  β_BD = y_D - y_B,
χ_BD = x_B·y_D - x_D·y_B,  α_CD = x_D - x_C,  β_CD = y_D - y_C,  χ_CD = x_C·y_D - x_D·y_C
where (x_A, y_A), (x_B, y_B), (x_C, y_C), and (x_D, y_D) are the pixel coordinates of the four intersection points, and s is the camera roll angle.
Step 204: Perform a correction calculation on the camera roll angle, and calculate the corrected pixel coordinates of the four intersection points.
The corrected pixel coordinates of the four intersection points are the pixel coordinates of the four intersection points on a distortion correction map with no camera roll angle.
The correction calculation for the camera roll angle can be performed by setting the camera roll angle to 0 and transforming the pixel coordinates of the four intersection points accordingly; the corrected pixel coordinates are the coordinates obtained after this transformation. That is, the pixel coordinates of the four intersection points after the camera roll angle has been corrected to 0 are calculated.
In one example, performing the correction calculation on the camera roll angle and calculating the corrected pixel coordinates of the four intersection points includes:
calculating, according to the camera roll angle, the corrected pixel coordinates of the four intersection points with the correction formulas:
x_A' = cos s·x_A + sin s·y_A    x_B' = cos s·x_B + sin s·y_B
y_A' = -sin s·x_A + cos s·y_A   y_B' = -sin s·x_B + cos s·y_B
x_C' = cos s·x_C + sin s·y_C    x_D' = cos s·x_D + sin s·y_D
y_C' = -sin s·x_C + cos s·y_C   y_D' = -sin s·x_D + cos s·y_D
where (x_A, y_A), (x_B, y_B), (x_C, y_C), and (x_D, y_D) are the pixel coordinates of the four intersection points; (x_A', y_A'), (x_B', y_B'), (x_C', y_C'), and (x_D', y_D') are the corrected pixel coordinates of the four intersection points; and s is the camera roll angle.
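The correction formulas above are a rotation of each intersection point by the roll angle s about the origin of the principal-point pixel coordinate system. A minimal sketch of this step follows; the array layout is illustrative only.

```python
import numpy as np

def correct_for_roll(points, s):
    """Step-204 correction: rotate the intersection points (in principal-point
    pixel coordinates) so that the camera roll angle becomes 0.

    points: array of shape (4, 2) holding (x_A, y_A) ... (x_D, y_D)
    s:      camera roll angle in radians
    Returns the corrected coordinates per the formulas
    x' = cos(s)*x + sin(s)*y,  y' = -sin(s)*x + cos(s)*y.
    """
    pts = np.asarray(points, dtype=float)
    R = np.array([[np.cos(s),  np.sin(s)],
                  [-np.sin(s), np.cos(s)]])
    return pts @ R.T
```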
Step 205: Determine the pixel coordinates of the designated vanishing point according to the corrected pixel coordinates of the four intersection points.
The designated vanishing point is an intersection formed by any set of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion correction map with no camera roll angle; that is, with the camera roll angle equal to 0, it is an intersection formed by any set of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion correction map.
Step 206: Extract the longitudinal focal length of the camera from the camera intrinsic parameters, and extract the ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point.
Step 207: Calculate the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
Step 208: Extract the lateral focal length of the camera from the camera intrinsic parameters, and extract the abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point.
Step 209: Calculate the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point, and the camera pitch angle.
Regarding the vanishing point in step 205, there are the following four cases:
(1) In the quadrilateral formed by the four intersection points, only the pair of opposite sides in the y-axis direction is non-parallel, i.e. a vanishing point exists only in the y-axis direction. As shown in FIG. 5, point (P), the intersection of lines A'C' and B'D', is the designated vanishing point in this step, and (u0, v0) are the pixel coordinates of point (P).
For this case, in step 207 the camera pitch angle can be calculated by the camera pitch angle calculation formula (image PCTCN2019079569-appb-000004), where t is the camera pitch angle, f_y is the camera's longitudinal focal length, and v_0 is the ordinate of the designated vanishing point (P).
In step 209, the camera yaw angle can be calculated by the camera yaw angle calculation formula (image PCTCN2019079569-appb-000005), where p is the camera yaw angle, t is the camera pitch angle, f_x is the camera's lateral focal length, and u_0 is the abscissa of the designated vanishing point (P).
(2) In the quadrilateral formed by the four intersection points, neither the pair of opposite sides in the x-axis direction nor the pair in the y-axis direction is parallel, i.e. there are two vanishing points, one in the x-axis direction and one in the y-axis direction. In this case, the camera pitch angle and yaw angle can be calculated from the coordinates of either of the two vanishing points. When the pitch and yaw angles are calculated from the vanishing point in the y-axis direction, the above camera pitch angle and camera yaw angle calculation formulas can be used directly; when they are calculated from the vanishing point in the x-axis direction, analogous calculation formulas can be derived from the vanishing-point rules and the relative positions of the coordinate systems, which are not repeated here.
(3) In the quadrilateral formed by the four intersection points, only the pair of opposite sides in the x-axis direction is non-parallel, i.e. a vanishing point exists only in the x-axis direction. As shown in FIG. 6, point (P'), the intersection of lines A'C' and B'D', is the designated vanishing point in this step, u_1 is the abscissa of point (P'), and v_1 is its ordinate. In this case the camera pitch angle is 0, and the camera yaw angle can be calculated by the formula (image PCTCN2019079569-appb-000006), where p is the camera yaw angle, f_x is the camera's lateral focal length, and u_1 is the abscissa of point (P').
(4) In the quadrilateral formed by the four intersection points, there is no pair of non-parallel opposite sides, i.e. no vanishing point exists, as shown in FIG. 7. In this case, both the camera pitch angle and the camera yaw angle are 0.
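The pitch and yaw formulas themselves are published only as images (appb-000004 and appb-000005) and are not reproduced above. The sketch below therefore uses the standard pinhole vanishing-point relations t = arctan(v0 / f_y) and p = arctan(u0 · cos t / f_x) as an assumption about their content, and illustrates case (1), where the designated vanishing point is the intersection of the corrected lines A'C' and B'D'.

```python
import numpy as np

def vanishing_point(a_pt, c_pt, b_pt, d_pt):
    """Intersection of line A'C' with line B'D' (corrected principal-point
    pixel coordinates); this is the designated vanishing point of case (1)."""
    def line(p, q):                       # line through p and q as (a, b, c)
        (x1, y1), (x2, y2) = p, q
        return np.array([y1 - y2, x2 - x1, x1 * y2 - x2 * y1])
    l1, l2 = line(a_pt, c_pt), line(b_pt, d_pt)
    x = np.cross(l1, l2)                  # homogeneous intersection of the two lines
    return x[:2] / x[2]                   # (u0, v0); x[2] == 0 means the lines are parallel

def pitch_yaw_from_vanishing_point(u0, v0, fx, fy):
    """Assumed standard relations: pitch from the ordinate of the vanishing
    point, yaw from its abscissa and the pitch angle."""
    t = np.arctan2(v0, fy)                # camera pitch angle
    p = np.arctan2(u0 * np.cos(t), fx)    # camera yaw angle
    return t, p
```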
Step 210: Acquire the distance, in the scene, between the two specified straight lines among the four straight lines.
The two specified straight lines are the two lines for which the designated vanishing point exists in the distortion correction map.
Step 211: Based on the corrected pixel coordinates of the four intersection points, calculate the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines.
Step 212: Calculate the camera mounting height based on the distance between the two specified straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept, and the camera pitch angle.
In this step, the camera mounting height can be calculated through the trigonometric relationship (image PCTCN2019079569-appb-000007), where f_x is the lateral focal length of the camera; w is the distance between the two specified straight lines among the four lines; h is the camera mounting height; t is the camera pitch angle; p is the camera yaw angle; and δ is the intercept formed where the horizontal axis of the principal-point pixel coordinate system intersects the two specified straight lines. As shown in FIG. 8, C is the camera principal point, x is the horizontal axis of the principal-point pixel coordinate system, and the lines A'B', A'D', A'C', and B'D' are the four straight lines determined by the corrected pixel coordinates of the four intersections of the four straight lines preset in the scene.
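Step 211 can be sketched as follows. The fragment computes where the two specified lines (assumed here to be A'C' and B'D', the lines through the designated vanishing point) cross the horizontal axis y = 0 of the principal-point pixel coordinate system; how the single intercept δ of FIG. 8 is formed from these two crossings (for example as their difference) is an assumption, since the height formula itself is published only as an image.

```python
def x_axis_intercept(p, q):
    """x-coordinate where the line through points p and q (corrected
    principal-point pixel coordinates) crosses the horizontal axis y = 0."""
    (x1, y1), (x2, y2) = p, q
    if y1 == y2:
        raise ValueError("line is parallel to the horizontal axis")
    return x1 - y1 * (x2 - x1) / (y2 - y1)

# Hypothetical usage with the corrected intersections A', B', C', D':
# i1 = x_axis_intercept(A_c, C_c)   # intercept of line A'C'
# i2 = x_axis_intercept(B_c, D_c)   # intercept of line B'D'
# delta = abs(i2 - i1)              # assumed relation to the intercept delta in FIG. 8
```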
步骤213,获取四条直线在场景中生成的四个交点中的第一交点在世界坐标系中的坐标。
其中,上述第一交点为四个交点中的任一交点。由于相机可安放在环境中的任意位置,在环境中选择一个基准坐标系来描述相机的位置,并用它描述环境中任何物体的位置,该坐标系即为世界坐标系。
步骤214,基于相机安装高度、相机俯仰角及相机偏航角,计算相机在相机局部世界坐标系中的坐标。
相机局部世界坐标系的原点为相机光轴与地面的交点,X轴、Y轴及Z轴分别与世界坐标系的X轴、Y轴及Z轴平行,如图9所示,其中,O-XYZ为相机局部世界坐标系,O w-X wY wZ w为世界坐标系。
本步骤中,可以采用如下三角函数关系,计算相机在相机局部世界坐标系中的坐标:
xcam=h*cos p*cot t
ycam=h*sin p*cot t
zcam=h
where (xcam, ycam, zcam) are the coordinates of the camera in the camera-local world coordinate system, h is the camera mounting height, t is the camera pitch angle, and p is the camera yaw angle.
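By way of illustration only, a direct Python transcription of the trigonometric relations above is given below; the height and angle values are assumed examples.

    import numpy as np

    def camera_in_local_world(h, t, p):
        # Step 214: camera coordinates in the camera-local world coordinate system
        # (origin at the intersection of the optical axis with the ground).
        xcam = h * np.cos(p) / np.tan(t)
        ycam = h * np.sin(p) / np.tan(t)
        zcam = h
        return np.array([xcam, ycam, zcam])

    cam_local = camera_in_local_world(h=6.5, t=0.35, p=0.05)   # assumed example values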
Step 215: calculate the coordinates of the first intersection point in the camera-local world coordinate system based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system, and the coordinates of the camera principal point in the pixel coordinate system. In this step, the coordinates of the above designated intersection point in the camera-local world coordinate system can be calculated using the following coordinate transformation relation between the camera-local world coordinate system and the pixel coordinate system of the distortion-corrected image:
λ·[u, v, 1]^T = A·M·[x_w, y_w, 0, 1]^T
where (u, v) are the pixel coordinates of the above first intersection point in the pixel coordinate system of the distortion-corrected image; (x_w, y_w, 0) are the coordinates of the above first intersection point in the camera-local world coordinate system; λ is a scale factor characterizing the mapping and scaling relationship between the pixel coordinate system and the camera-local world coordinate system;
[A is the intrinsic matrix formed from f_x, f_y and (cx, cy), and M is the extrinsic matrix determined by the rotation angles s, t, p and the coordinates of the camera in the camera-local world coordinate system; the expanded matrices are reproduced as images in the original publication.]
p is the camera yaw angle; t is the camera pitch angle; s is the camera roll angle; (cx, cy) are the pixel coordinates of the camera principal point in the pixel coordinate system of the distortion-corrected image; f_x is the lateral focal length of the camera; f_y is the longitudinal focal length of the camera; (x_w, y_w, 0) are the coordinates of the above first intersection point in the camera-local world coordinate system.
Step 216: calculate the coordinates of the origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system.
In this step, the coordinates of the origin of the camera-local world coordinate system in the world coordinate system can be calculated through the following coordinate transformation formulas:
x_0 = X_aw - x_w
y_0 = Y_aw - y_w
z_0 = 0
where (x_w, y_w, 0) are the coordinates of the first intersection point in the camera-local world coordinate system, (x_0, y_0, z_0) are the coordinates of the origin of the camera-local world coordinate system in the world coordinate system, and (X_aw, Y_aw, 0) are the coordinates of the first intersection point in the world coordinate system.
Step 217: calculate the coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
The coordinates of the camera in the world coordinate system, i.e. the actual mounting position of the camera in the world coordinate system, are then calculated according to the following formulas:
X_cam = xcam + x_0
Y_cam = ycam + y_0
Z_cam = zcam
where (x_0, y_0, z_0) are the coordinates of the origin of the camera-local world coordinate system in the world coordinate system, (xcam, ycam, zcam) are the coordinates of the camera in the camera-local world coordinate system, and (X_cam, Y_cam, Z_cam) are the coordinates of the camera in the world coordinate system.
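By way of illustration only, steps 216 and 217 reduce to plain coordinate offsets; a minimal Python sketch is given below, with all input values assumed.

    import numpy as np

    def camera_in_world(first_pt_local, first_pt_world, cam_local):
        # Step 216: origin of the camera-local world coordinate system in the world frame.
        x0 = first_pt_world[0] - first_pt_local[0]
        y0 = first_pt_world[1] - first_pt_local[1]
        origin_world = np.array([x0, y0, 0.0])
        # Step 217: camera position in the world frame (Z_cam = zcam is unchanged).
        return origin_world + cam_local

    cam_world = camera_in_world(first_pt_local=(4.2, 17.8),        # (x_w, y_w), assumed
                                first_pt_world=(103.0, 255.5),     # (X_aw, Y_aw), assumed
                                cam_local=np.array([1.1, 0.3, 6.5]))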
According to actual needs, only some of the camera extrinsic parameters among the camera roll angle, the camera pitch angle, the camera yaw angle, the camera mounting height and the camera position may be calculated by following the corresponding steps in FIG. 4, which is not described again here.
In the camera extrinsic parameter calibration method shown in FIG. 4 provided by the embodiment of the present application, distortion correction is performed on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image; in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines are acquired; and the camera extrinsic parameters are calibrated based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy. By acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines that enclose a closed rectangular region in the scene, the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing, so that the calculation process is stable, the calculation time is short, and the speed of camera extrinsic parameter calibration and the accuracy of the calibration result are improved.
Based on the same inventive concept and corresponding to the camera extrinsic parameter calibration method provided by the above embodiments of the present application, an embodiment of the present application provides a camera extrinsic parameter calibration apparatus, whose schematic structural diagram is shown in FIG. 10, including:
a distortion-corrected image acquisition module 301, configured to perform distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, the closed region enclosed by the four straight lines in the scene being a rectangle;
a pixel coordinate acquisition module 302, configured to acquire, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being the coordinates of the four intersection points in the principal-point pixel coordinate system;
an extrinsic parameter calibration module 303, configured to calibrate the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy.
In one example, the pixel coordinate acquisition module 302 is specifically configured to:
acquire, in the distortion-corrected image, the line equations of the four straight lines in the principal-point pixel coordinate system respectively by means of line fitting, and calculate the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
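By way of illustration only, a minimal Python sketch of the two operations of this module, line fitting and intersection, is given below; the sampled pixel points are assumed values, and np.polyfit is used merely as one possible fitting method.

    import numpy as np

    def fit_line(pts):
        # Least-squares fit y = m*x + b to sampled pixel points of one straight line.
        xs, ys = np.asarray(pts, dtype=float).T
        m, b = np.polyfit(xs, ys, 1)
        return m, b

    def intersect(l1, l2):
        # Intersection of two lines given in slope-intercept form (m, b).
        (m1, b1), (m2, b2) = l1, l2
        x = (b2 - b1) / (m1 - m2)
        return np.array([x, m1 * x + b1])

    line_AB = fit_line([(120, 80), (340, 95), (560, 110)])   # assumed samples on one line
    line_AC = fit_line([(118, 78), (140, 300), (165, 540)])  # assumed samples on another line
    corner_A = intersect(line_AB, line_AC)                   # one of the four intersection points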
In one example, the camera extrinsic parameters include the camera roll angle;
the extrinsic parameter calibration module 303 is specifically configured to: calculate the camera roll angle based on the pixel coordinates of the four intersection points, using the property that opposite sides of the rectangular region are equal and the preset mapping relationship between the principal-point pixel coordinate system and the camera-local world coordinate system.
In one example, the extrinsic parameter calibration module 303 is further configured to: perform a correction operation on the camera roll angle to calculate the corrected pixel coordinates of the four intersection points, the corrected pixel coordinates of the four intersection points being the pixel coordinates of the four intersection points on a distortion-corrected image without camera roll; determine the pixel coordinates of a designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being the intersection point formed by any pair of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion-corrected image without camera roll; extract the longitudinal focal length of the camera from the camera intrinsic parameters, and extract the ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point; and calculate the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
In one example, the extrinsic parameter calibration module 303 is specifically configured to:
calculate the corrected pixel coordinates of the four intersection points according to the camera roll angle using a correction formula, the correction formula being:
x_A' = cos s·x_A + sin s·y_A,  x_B' = cos s·x_B + sin s·y_B
y_A' = -sin s·x_A + cos s·y_A,  y_B' = -sin s·x_B + cos s·y_B
x_C' = cos s·x_C + sin s·y_C,  x_D' = cos s·x_D + sin s·y_D
y_C' = -sin s·x_C + cos s·y_C,  y_D' = -sin s·x_D + cos s·y_D
where (x_A, y_A), (x_B, y_B), (x_C, y_C) and (x_D, y_D) are the pixel coordinates of the four intersection points, respectively; (x_A', y_A'), (x_B', y_B'), (x_C', y_C') and (x_D', y_D') are the corrected pixel coordinates of the four intersection points, respectively; and s is the camera roll angle.
In one example, the extrinsic parameter calibration module 303 is further configured to: extract the lateral focal length of the camera from the camera intrinsic parameters, and extract the abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point; and calculate the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle.
In one example, the extrinsic parameter calibration module 303 is further configured to: acquire the distance, in the scene, between two designated straight lines among the four straight lines, the two designated straight lines being the two straight lines that form the designated vanishing point in the distortion-corrected image; calculate, based on the corrected pixel coordinates of the four intersection points, the intercept formed by the intersection of the horizontal axis of the principal-point pixel coordinate system with the two designated straight lines; and calculate the camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle.
In one example, the extrinsic parameter calibration module 303 is further configured to: acquire the coordinates, in the world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene, the first intersection point being any one of the four intersection points; calculate the coordinates of the camera in the camera-local world coordinate system based on the camera mounting height, the camera pitch angle and the camera yaw angle; calculate the coordinates of the first intersection point in the camera-local world coordinate system based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system and the coordinates of the camera principal point in the pixel coordinate system; calculate the coordinates of the origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system; and calculate the coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
In the camera extrinsic parameter calibration apparatus provided by the embodiment of the present invention, the distortion-corrected image acquisition module 301 performs distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image; the pixel coordinate acquisition module 302 acquires, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines; and the extrinsic parameter calibration module 303 calibrates the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy. By acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines that enclose a closed rectangular region in the scene, the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing, so that the calculation process is stable, the calculation time is short, and the speed of camera extrinsic parameter calibration and the accuracy of the calibration result are improved.
Based on the same inventive concept and corresponding to the camera extrinsic parameter calibration method provided by the above embodiments of the present application, an embodiment of the present application further provides an electronic device, as shown in FIG. 11, including a processor 401 and a memory 402, wherein
the memory 402 is configured to store a computer program; and
the processor 401 is configured to implement the camera extrinsic parameter calibration method provided by the embodiments of the present application when executing the program stored in the memory 402.
For example, the following steps may be included:
performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, the closed region enclosed by the four straight lines in the scene being a rectangle;
acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being the coordinates of the four intersection points in the principal-point pixel coordinate system;
calibrating the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy.
In one example, the step of acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines may include:
acquiring, in the distortion-corrected image, the line equations of the four straight lines in the principal-point pixel coordinate system respectively by means of line fitting;
calculating the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
In one example, the camera extrinsic parameters include the camera roll angle.
The step of calibrating the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy may include:
calculating the camera roll angle based on the pixel coordinates of the four intersection points, using the property that opposite sides of the rectangular region are equal and the preset mapping relationship between the principal-point pixel coordinate system and the camera-local world coordinate system.
In one example, after the step of calculating the camera roll angle based on the pixel coordinates of the four intersection points using the property that opposite sides of the rectangular region are equal and the preset mapping relationship between the principal-point pixel coordinate system and the camera-local world coordinate system, the method may further include:
performing a correction operation on the camera roll angle to calculate the corrected pixel coordinates of the four intersection points, the corrected pixel coordinates of the four intersection points being the pixel coordinates of the four intersection points on a distortion-corrected image without camera roll;
determining the pixel coordinates of a designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being the intersection point formed by any pair of non-parallel opposite sides of the quadrilateral formed by the four intersection points on the distortion-corrected image without camera roll;
extracting the longitudinal focal length of the camera from the camera intrinsic parameters, and extracting the ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point;
calculating the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
In one example, the step of performing a correction operation on the camera roll angle to calculate the corrected pixel coordinates of the four intersection points may include:
calculating the corrected pixel coordinates of the four intersection points according to the camera roll angle using a correction formula, the correction formula being:
x_A' = cos s·x_A + sin s·y_A,  x_B' = cos s·x_B + sin s·y_B
y_A' = -sin s·x_A + cos s·y_A,  y_B' = -sin s·x_B + cos s·y_B
x_C' = cos s·x_C + sin s·y_C,  x_D' = cos s·x_D + sin s·y_D
y_C' = -sin s·x_C + cos s·y_C,  y_D' = -sin s·x_D + cos s·y_D
where (x_A, y_A), (x_B, y_B), (x_C, y_C) and (x_D, y_D) are the pixel coordinates of the four intersection points, respectively; (x_A', y_A'), (x_B', y_B'), (x_C', y_C') and (x_D', y_D') are the corrected pixel coordinates of the four intersection points, respectively; and s is the camera roll angle.
In one example, after the step of calculating the camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point, the method further includes:
extracting the lateral focal length of the camera from the camera intrinsic parameters, and extracting the abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point;
calculating the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle.
In one example, after the step of calculating the camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle, the method may further include:
acquiring the distance, in the scene, between two designated straight lines among the four straight lines, the two designated straight lines being the two straight lines corresponding to the designated vanishing point in the distortion-corrected image;
calculating, based on the corrected pixel coordinates of the four intersection points, the intercept formed by the intersection of the horizontal axis of the principal-point pixel coordinate system with the two designated straight lines;
calculating the camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle.
In one example, after the step of calculating the camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle, the method may further include:
acquiring the coordinates, in the world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene, the first intersection point being any one of the four intersection points;
calculating the coordinates of the camera in the camera-local world coordinate system based on the camera mounting height, the camera pitch angle and the camera yaw angle;
calculating the coordinates of the first intersection point in the camera-local world coordinate system based on the coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system and the coordinates of the camera principal point in the pixel coordinate system;
calculating the coordinates of the origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system;
calculating the coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
The memory may include a random access memory (RAM), and may also include a non-volatile memory (NVM), for example at least one disk memory. Further, the memory may also be at least one storage device located remotely from the aforementioned processor.
The above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The above processor 401 and memory 402 may be connected through a communication bus such as an address bus, a data bus or a control bus, and the communication bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The electronic device may communicate with other external devices through a communication interface.
The above processor 401 and memory 402, and the electronic device and other external devices, may also communicate through a wireless connection provided by a wireless module.
The method adopted by the electronic device provided by the embodiment of the present application is: performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, the closed region enclosed by the four straight lines in the scene being a rectangle; acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being the coordinates of the four intersection points in the principal-point pixel coordinate system; and calibrating the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy. Through the above method, the pixel coordinates of the four intersection points formed by the four straight lines that enclose a closed rectangular region in the scene can be acquired in the distortion-corrected image, and the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing, so that the calculation process is stable, the calculation time is short, and the speed of camera extrinsic parameter calibration and the accuracy of the calibration result are improved.
In yet another embodiment provided by the present application, a computer-readable storage medium is further provided, in which instructions are stored; when the instructions are run on a computer, the computer is caused to execute the camera extrinsic parameter calibration method of any one of the above embodiments.
The method adopted by the computer-readable storage medium provided by the embodiment of the present application is: performing distortion correction on the original calibration picture based on the camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, the closed region enclosed by the four straight lines in the scene being a rectangle; acquiring, in the distortion-corrected image, the pixel coordinates of the four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being the coordinates of the four intersection points in the principal-point pixel coordinate system; and calibrating the camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy. Through the above method, the pixel coordinates of the four intersection points formed by the four straight lines that enclose a closed rectangular region in the scene can be acquired in the distortion-corrected image, and the camera extrinsic parameters can be calibrated through the preset coordinate transformation strategy without iterative processing, so that the calculation process is stable, the calculation time is short, and the speed of camera extrinsic parameter calibration and the accuracy of the calibration result are improved.
It should be noted that relational terms herein such as "first" and "second" are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including that element.
The embodiments in this specification are described in a related manner; for the same or similar parts between the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the apparatus, electronic device and storage medium embodiments are substantially similar to the method embodiments, their description is relatively simple, and reference may be made to the relevant parts of the description of the method embodiments.
The above are only preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included within the scope of protection of the present application.

Claims (25)

  1. A camera extrinsic parameter calibration method, wherein the method comprises:
    performing distortion correction on an original calibration picture based on camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, a closed region enclosed by the four straight lines in the scene being a rectangle;
    acquiring, in the distortion-corrected image, pixel coordinates of four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being coordinates of the four intersection points in a principal-point pixel coordinate system;
    calibrating camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy.
  2. The method according to claim 1, wherein the acquiring, in the distortion-corrected image, pixel coordinates of four intersection points formed by the four straight lines comprises:
    acquiring, in the distortion-corrected image, line equations of the four straight lines in the principal-point pixel coordinate system respectively by means of line fitting;
    calculating the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
  3. The method according to claim 1, wherein the camera extrinsic parameters comprise a camera roll angle;
    the calibrating camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy comprises:
    calculating the camera roll angle based on the pixel coordinates of the four intersection points, using a property that opposite sides of a rectangular region are equal and a preset mapping relationship between the principal-point pixel coordinate system and a camera-local world coordinate system.
  4. The method according to claim 3, wherein after the calculating the camera roll angle based on the pixel coordinates of the four intersection points using the property that opposite sides of the rectangular region are equal and the preset mapping relationship between the principal-point pixel coordinate system and the camera-local world coordinate system, the method further comprises:
    performing a correction operation on the camera roll angle to calculate corrected pixel coordinates of the four intersection points, the corrected pixel coordinates of the four intersection points being pixel coordinates of the four intersection points on a distortion-corrected image without camera roll;
    determining pixel coordinates of a designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being an intersection point formed by any pair of non-parallel opposite sides of a quadrilateral formed by the four intersection points on the distortion-corrected image without camera roll;
    extracting a longitudinal focal length of the camera from the camera intrinsic parameters, and extracting an ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculating a camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  5. The method according to claim 4, wherein the performing a correction operation on the camera roll angle to calculate corrected pixel coordinates of the four intersection points comprises:
    calculating the corrected pixel coordinates of the four intersection points according to the camera roll angle using a correction formula, the correction formula being:
    x_A' = cos s·x_A + sin s·y_A,  x_B' = cos s·x_B + sin s·y_B
    y_A' = -sin s·x_A + cos s·y_A,  y_B' = -sin s·x_B + cos s·y_B
    x_C' = cos s·x_C + sin s·y_C,  x_D' = cos s·x_D + sin s·y_D
    y_C' = -sin s·x_C + cos s·y_C,  y_D' = -sin s·x_D + cos s·y_D
    wherein (x_A, y_A), (x_B, y_B), (x_C, y_C) and (x_D, y_D) are the pixel coordinates of the four intersection points, respectively; (x_A', y_A'), (x_B', y_B'), (x_C', y_C') and (x_D', y_D') are the corrected pixel coordinates of the four intersection points, respectively; and s is the camera roll angle.
  6. The method according to claim 4, wherein after the calculating a camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point, the method further comprises:
    extracting a lateral focal length of the camera from the camera intrinsic parameters, and extracting an abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculating a camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle.
  7. The method according to claim 6, wherein after the calculating a camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle, the method further comprises:
    acquiring a distance, in the scene, between two designated straight lines among the four straight lines, the two designated straight lines being two straight lines corresponding to the designated vanishing point in the distortion-corrected image;
    calculating, based on the corrected pixel coordinates of the four intersection points, an intercept formed by intersection of a horizontal axis of the principal-point pixel coordinate system with the two designated straight lines;
    calculating a camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle.
  8. The method according to claim 7, wherein after the calculating a camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle, the method further comprises:
    acquiring coordinates, in a world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene, the first intersection point being any one of the four intersection points;
    calculating coordinates of the camera in the camera-local world coordinate system based on the camera mounting height, the camera pitch angle and the camera yaw angle;
    calculating coordinates of the first intersection point in the camera-local world coordinate system based on coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system, and coordinates of the camera principal point in the pixel coordinate system;
    calculating coordinates of an origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system;
    calculating coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
  9. A camera extrinsic parameter calibration apparatus, wherein the apparatus comprises:
    a distortion-corrected image acquisition module, configured to perform distortion correction on an original calibration picture based on camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, a closed region enclosed by the four straight lines in the scene being a rectangle;
    a pixel coordinate acquisition module, configured to acquire, in the distortion-corrected image, pixel coordinates of four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being coordinates of the four intersection points in a principal-point pixel coordinate system;
    an extrinsic parameter calibration module, configured to calibrate camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy.
  10. The apparatus according to claim 9, wherein the pixel coordinate acquisition module is specifically configured to:
    acquire, in the distortion-corrected image, line equations of the four straight lines in the principal-point pixel coordinate system respectively by means of line fitting;
    calculate the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
  11. The apparatus according to claim 9, wherein the camera extrinsic parameters comprise a camera roll angle;
    the extrinsic parameter calibration module is specifically configured to:
    calculate the camera roll angle based on the pixel coordinates of the four intersection points, using a property that opposite sides of a rectangular region are equal and a preset mapping relationship between the principal-point pixel coordinate system and a camera-local world coordinate system.
  12. The apparatus according to claim 11, wherein the extrinsic parameter calibration module is further configured to:
    perform a correction operation on the camera roll angle to calculate corrected pixel coordinates of the four intersection points, the corrected pixel coordinates of the four intersection points being pixel coordinates of the four intersection points on a distortion-corrected image without camera roll;
    determine pixel coordinates of a designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being an intersection point formed by any pair of non-parallel opposite sides of a quadrilateral formed by the four intersection points on the distortion-corrected image without camera roll;
    extract a longitudinal focal length of the camera from the camera intrinsic parameters, and extract an ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculate a camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  13. The apparatus according to claim 12, wherein the extrinsic parameter calibration module is specifically configured to:
    calculate the corrected pixel coordinates of the four intersection points according to the camera roll angle using a correction formula, the correction formula being:
    x_A' = cos s·x_A + sin s·y_A,  x_B' = cos s·x_B + sin s·y_B
    y_A' = -sin s·x_A + cos s·y_A,  y_B' = -sin s·x_B + cos s·y_B
    x_C' = cos s·x_C + sin s·y_C,  x_D' = cos s·x_D + sin s·y_D
    y_C' = -sin s·x_C + cos s·y_C,  y_D' = -sin s·x_D + cos s·y_D
    wherein (x_A, y_A), (x_B, y_B), (x_C, y_C) and (x_D, y_D) are the pixel coordinates of the four intersection points, respectively; (x_A', y_A'), (x_B', y_B'), (x_C', y_C') and (x_D', y_D') are the corrected pixel coordinates of the four intersection points, respectively; and s is the camera roll angle.
  14. The apparatus according to claim 12, wherein the extrinsic parameter calibration module is further configured to:
    extract a lateral focal length of the camera from the camera intrinsic parameters, and extract an abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculate a camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle.
  15. The apparatus according to claim 14, wherein the extrinsic parameter calibration module is further configured to:
    acquire a distance, in the scene, between two designated straight lines among the four straight lines, the two designated straight lines being two straight lines corresponding to the designated vanishing point in the distortion-corrected image;
    calculate, based on the corrected pixel coordinates of the four intersection points, an intercept formed by intersection of a horizontal axis of the principal-point pixel coordinate system with the two designated straight lines;
    calculate a camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle.
  16. The apparatus according to claim 9, wherein the extrinsic parameter calibration module is further configured to:
    acquire coordinates, in a world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene, the first intersection point being any one of the four intersection points;
    calculate coordinates of the camera in the camera-local world coordinate system based on the camera mounting height, the camera pitch angle and the camera yaw angle;
    calculate coordinates of the first intersection point in the camera-local world coordinate system based on coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system, and coordinates of the camera principal point in the pixel coordinate system;
    calculate coordinates of an origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system;
    calculate coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
  17. An electronic device, comprising a processor and a memory, wherein
    the memory is configured to store a computer program;
    the processor is configured to implement the following steps when executing the program stored in the memory:
    performing distortion correction on an original calibration picture based on camera intrinsic parameters to obtain a distortion-corrected image, the original calibration picture being a picture obtained by photographing, with the camera, a scene in which four straight lines are preset, a closed region enclosed by the four straight lines in the scene being a rectangle;
    acquiring, in the distortion-corrected image, pixel coordinates of four intersection points formed by the four straight lines, the pixel coordinates of the four intersection points being coordinates of the four intersection points in a principal-point pixel coordinate system;
    calibrating camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy.
  18. The electronic device according to claim 17, wherein the step of acquiring, in the distortion-corrected image, pixel coordinates of four intersection points formed by the four straight lines comprises:
    acquiring, in the distortion-corrected image, line equations of the four straight lines in the principal-point pixel coordinate system respectively by means of line fitting;
    calculating the pixel coordinates of the four intersection points formed by the four straight lines according to the line equations of the four straight lines in the principal-point pixel coordinate system.
  19. The electronic device according to claim 17, wherein the camera extrinsic parameters comprise a camera roll angle;
    the step of calibrating camera extrinsic parameters based on the pixel coordinates of the four intersection points through a preset coordinate transformation strategy comprises:
    calculating the camera roll angle based on the pixel coordinates of the four intersection points, using a property that opposite sides of a rectangular region are equal and a preset mapping relationship between the principal-point pixel coordinate system and a camera-local world coordinate system.
  20. The electronic device according to claim 19, wherein after the step of calculating the camera roll angle based on the pixel coordinates of the four intersection points using the property that opposite sides of the rectangular region are equal and the preset mapping relationship between the principal-point pixel coordinate system and the camera-local world coordinate system, the following is further included:
    performing a correction operation on the camera roll angle to calculate corrected pixel coordinates of the four intersection points, the corrected pixel coordinates of the four intersection points being pixel coordinates of the four intersection points on a distortion-corrected image without camera roll;
    determining pixel coordinates of a designated vanishing point according to the corrected pixel coordinates of the four intersection points, the designated vanishing point being an intersection point formed by any pair of non-parallel opposite sides of a quadrilateral formed by the four intersection points on the distortion-corrected image without camera roll;
    extracting a longitudinal focal length of the camera from the camera intrinsic parameters, and extracting an ordinate of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculating a camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point.
  21. The electronic device according to claim 20, wherein the step of performing a correction operation on the camera roll angle to calculate the corrected pixel coordinates of the four intersection points comprises:
    calculating the corrected pixel coordinates of the four intersection points according to the camera roll angle using a correction formula, the correction formula being:
    x_A' = cos s·x_A + sin s·y_A,  x_B' = cos s·x_B + sin s·y_B
    y_A' = -sin s·x_A + cos s·y_A,  y_B' = -sin s·x_B + cos s·y_B
    x_C' = cos s·x_C + sin s·y_C,  x_D' = cos s·x_D + sin s·y_D
    y_C' = -sin s·x_C + cos s·y_C,  y_D' = -sin s·x_D + cos s·y_D
    wherein (x_A, y_A), (x_B, y_B), (x_C, y_C) and (x_D, y_D) are the pixel coordinates of the four intersection points, respectively; (x_A', y_A'), (x_B', y_B'), (x_C', y_C') and (x_D', y_D') are the corrected pixel coordinates of the four intersection points, respectively; and s is the camera roll angle.
  22. The electronic device according to claim 20, wherein after the step of calculating a camera pitch angle based on the longitudinal focal length of the camera and the ordinate of the designated vanishing point, the following is further included:
    extracting a lateral focal length of the camera from the camera intrinsic parameters, and extracting an abscissa of the designated vanishing point from the pixel coordinates of the designated vanishing point;
    calculating a camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle.
  23. The electronic device according to claim 22, wherein after the step of calculating a camera yaw angle based on the lateral focal length of the camera, the abscissa of the designated vanishing point and the camera pitch angle, the following is further included:
    acquiring a distance, in the scene, between two designated straight lines among the four straight lines, the two designated straight lines being two straight lines corresponding to the designated vanishing point in the distortion-corrected image;
    calculating, based on the corrected pixel coordinates of the four intersection points, an intercept formed by intersection of a horizontal axis of the principal-point pixel coordinate system with the two designated straight lines;
    calculating a camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle.
  24. The electronic device according to claim 23, wherein after the step of calculating a camera mounting height based on the distance between the two designated straight lines among the four straight lines, the lateral focal length of the camera, the camera yaw angle, the intercept and the camera pitch angle, the following is further included:
    acquiring coordinates, in a world coordinate system, of a first intersection point among the four intersection points generated by the four straight lines in the scene, the first intersection point being any one of the four intersection points;
    calculating coordinates of the camera in the camera-local world coordinate system based on the camera mounting height, the camera pitch angle and the camera yaw angle;
    calculating coordinates of the first intersection point in the camera-local world coordinate system based on coordinates of the first intersection point in the principal-point pixel coordinate system, the camera roll angle, the camera pitch angle, the camera yaw angle, the lateral focal length of the camera, the longitudinal focal length of the camera, the coordinates of the camera in the camera-local world coordinate system, and coordinates of the camera principal point in the pixel coordinate system;
    calculating coordinates of an origin of the camera-local world coordinate system in the world coordinate system according to the coordinates of the first intersection point in the camera-local world coordinate system and the coordinates of the first intersection point in the world coordinate system;
    calculating coordinates of the camera in the world coordinate system based on the coordinates of the origin of the camera-local world coordinate system in the world coordinate system and the coordinates of the camera in the camera-local world coordinate system.
  25. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the method steps of any one of claims 1 to 8 are implemented.
PCT/CN2019/079569 2018-03-30 2019-03-25 一种相机外参标定方法、装置及电子设备 WO2019184885A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810298452.0A CN110322513B (zh) 2018-03-30 2018-03-30 一种相机外参标定方法、装置及电子设备
CN201810298452.0 2018-03-30

Publications (1)

Publication Number Publication Date
WO2019184885A1 true WO2019184885A1 (zh) 2019-10-03

Family

ID=68062488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/079569 WO2019184885A1 (zh) 2018-03-30 2019-03-25 一种相机外参标定方法、装置及电子设备

Country Status (2)

Country Link
CN (1) CN110322513B (zh)
WO (1) WO2019184885A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862895B (zh) * 2019-11-27 2023-10-10 杭州海康威视数字技术股份有限公司 一种鱼眼摄像头标定方法、装置及系统
CN111652945A (zh) * 2020-06-03 2020-09-11 北京方程奇迹科技有限公司 一种相机标定方法
CN111751007B (zh) * 2020-06-24 2021-11-26 杭州海康消防科技有限公司 热成像测温方法、装置及存储介质
CN112070846A (zh) * 2020-09-14 2020-12-11 北京华严互娱科技有限公司 基于视频的实时人体动作跟踪方法和系统
CN112150559A (zh) * 2020-09-24 2020-12-29 深圳佑驾创新科技有限公司 图像采集装置的标定方法、计算机设备及存储介质
CN112614045B (zh) * 2020-12-16 2022-05-31 上海交通大学 农机前方作业环境视觉感知透视效应的消除方法和系统
CN112967344B (zh) * 2021-03-09 2023-12-08 阿波罗智联(北京)科技有限公司 相机外参标定的方法、设备、存储介质及程序产品
CN113643358B (zh) * 2021-08-10 2023-07-07 追觅创新科技(苏州)有限公司 相机的外参标定方法、装置、存储介质及系统
CN115760620B (zh) * 2022-11-18 2023-10-20 荣耀终端有限公司 一种文档矫正方法、装置及电子设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102013099A (zh) * 2010-11-26 2011-04-13 中国人民解放军国防科学技术大学 车载摄像机外参数交互式标定方法
WO2017138216A1 (ja) * 2016-02-10 2017-08-17 クラリオン株式会社 キャリブレーションシステム、キャリブレーション装置
CN107292927A (zh) * 2017-06-13 2017-10-24 厦门大学 一种基于双目视觉的对称运动平台位姿测量方法
CN107767422A (zh) * 2017-09-18 2018-03-06 深圳开阳电子股份有限公司 一种鱼眼镜头的校正方法、装置及便携式终端

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2465792A (en) * 2008-11-28 2010-06-02 Sony Corp Illumination Direction Estimation using Reference Object
CN102136140B (zh) * 2010-12-24 2012-07-18 东南大学 一种基于矩形图样的视频图像距离检测方法
KR102449438B1 (ko) * 2015-09-09 2022-09-30 한국전자통신연구원 입방체 복원 장치 및 방법
CN106558080B (zh) * 2016-11-14 2020-04-24 天津津航技术物理研究所 一种单目相机外参在线标定方法
CN106875448B (zh) * 2017-02-16 2019-07-23 武汉极目智能技术有限公司 一种车载单目摄像头外部参数自标定方法

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110910410A (zh) * 2019-11-07 2020-03-24 河海大学 一种基于计算机视觉的球场定位系统及方法
CN111028296A (zh) * 2019-11-07 2020-04-17 浙江大华技术股份有限公司 球机焦距值估算方法、装置、设备及存储装置
CN111028296B (zh) * 2019-11-07 2023-05-12 浙江大华技术股份有限公司 球机焦距值估算方法、装置、设备及存储装置
CN111260736A (zh) * 2020-01-16 2020-06-09 中国科学院西安光学精密机械研究所 一种空间相机内参在轨实时标定方法
CN111260736B (zh) * 2020-01-16 2023-04-11 中国科学院西安光学精密机械研究所 一种空间相机内参在轨实时标定方法
CN111340890A (zh) * 2020-02-20 2020-06-26 北京百度网讯科技有限公司 相机外参标定方法、装置、设备和可读存储介质
CN111340890B (zh) * 2020-02-20 2023-08-04 阿波罗智联(北京)科技有限公司 相机外参标定方法、装置、设备和可读存储介质
CN113376617B (zh) * 2020-02-25 2024-04-05 北京京东乾石科技有限公司 雷达标定结果准确性的评价方法、装置、存储介质及系统
CN113376617A (zh) * 2020-02-25 2021-09-10 北京京东乾石科技有限公司 雷达标定结果准确性的评价方法、装置、存储介质及系统
CN113516717A (zh) * 2020-04-10 2021-10-19 富华科精密工业(深圳)有限公司 摄像装置外参标定方法、电子设备及存储介质
CN113516718A (zh) * 2020-04-10 2021-10-19 富华科精密工业(深圳)有限公司 批量摄像装置外参标定方法及电子设备
CN111862231B (zh) * 2020-06-15 2024-04-12 南方科技大学 一种相机标定方法、车道偏离预警方法及系统
CN111862231A (zh) * 2020-06-15 2020-10-30 南方科技大学 一种相机标定方法、车道偏离预警方法及系统
CN111739104B (zh) * 2020-06-24 2024-05-03 深圳市道通科技股份有限公司 一种激光标定系统的标定方法、装置以及激光标定系统
CN111739104A (zh) * 2020-06-24 2020-10-02 深圳市道通科技股份有限公司 一种激光标定系统的标定方法、装置以及激光标定系统
CN112102416A (zh) * 2020-08-28 2020-12-18 中国科学院深圳先进技术研究院 用于多个相机的自动标定方法及系统
CN112150560A (zh) * 2020-09-27 2020-12-29 上海高德威智能交通系统有限公司 确定消失点的方法、装置及计算机存储介质
CN112150560B (zh) * 2020-09-27 2024-02-02 上海高德威智能交通系统有限公司 确定消失点的方法、装置及计算机存储介质
CN112738487A (zh) * 2020-12-24 2021-04-30 北京百度网讯科技有限公司 图像投射方法、装置、设备及存储介质
US11715238B2 (en) 2020-12-24 2023-08-01 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Image projection method, apparatus, device and storage medium
CN112991465A (zh) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 相机标定方法、装置、电子设备和计算机可读介质
CN113240752A (zh) * 2021-05-21 2021-08-10 中科创达软件股份有限公司 一种内参和外参协同标定方法和装置
CN113240752B (zh) * 2021-05-21 2024-03-22 中科创达软件股份有限公司 一种内参和外参协同标定方法和装置
CN113269824B (zh) * 2021-05-28 2023-07-07 陕西工业职业技术学院 一种基于图像的距离确定方法及系统
CN113269824A (zh) * 2021-05-28 2021-08-17 陕西工业职业技术学院 一种基于图像的距离确定方法及系统
CN113465573A (zh) * 2021-06-30 2021-10-01 深圳市优必选科技股份有限公司 单目测距方法、装置及智能装置
CN113838138A (zh) * 2021-08-06 2021-12-24 杭州灵西机器人智能科技有限公司 一种优化特征提取的系统标定方法、系统、装置和介质
CN113610929A (zh) * 2021-08-09 2021-11-05 西安外事学院 一种相机与多线激光的联合标定方法
CN113781575A (zh) * 2021-08-09 2021-12-10 上海奥视达智能科技有限公司 一种相机参数的标定方法、装置、终端和存储介质
CN113781575B (zh) * 2021-08-09 2024-01-12 上海奥视达智能科技有限公司 一种相机参数的标定方法、装置、终端和存储介质
CN113610929B (zh) * 2021-08-09 2023-08-18 西安外事学院 一种相机与多线激光的联合标定方法
CN113870163A (zh) * 2021-09-24 2021-12-31 埃洛克航空科技(北京)有限公司 基于三维场景的视频融合方法以及装置、存储介质、电子装置
CN113870163B (zh) * 2021-09-24 2022-11-29 埃洛克航空科技(北京)有限公司 基于三维场景的视频融合方法以及装置、存储介质、电子装置
CN114018932A (zh) * 2021-11-02 2022-02-08 西安电子科技大学 基于矩形标定物的路面病害指标测量方法
CN114219850A (zh) * 2021-11-16 2022-03-22 英博超算(南京)科技有限公司 一种应用360全景环视技术的车辆测距系统
CN114419165A (zh) * 2022-01-17 2022-04-29 北京百度网讯科技有限公司 相机外参校正方法、装置、电子设备和存储介质
CN114419165B (zh) * 2022-01-17 2024-01-12 北京百度网讯科技有限公司 相机外参校正方法、装置、电子设备和存储介质
WO2023216982A1 (zh) * 2022-05-10 2023-11-16 腾讯科技(深圳)有限公司 数据处理方法、装置、计算机设备、存储介质及程序产品
CN114830911A (zh) * 2022-05-19 2022-08-02 苏州大学 智能除草方法、装置和存储介质
CN115031696A (zh) * 2022-05-23 2022-09-09 深圳大学 基于倾斜成像结构的双目成像测量方法及系统
CN114926371B (zh) * 2022-06-27 2023-04-07 北京五八信息技术有限公司 一种全景图的垂直校正、灭点检测方法、设备及存储介质
CN114926371A (zh) * 2022-06-27 2022-08-19 北京五八信息技术有限公司 一种全景图的垂直校正、灭点检测方法、设备及存储介质
CN115775282A (zh) * 2023-01-29 2023-03-10 广州市易鸿智能装备有限公司 一种高速在线矫正图像畸变的方法、装置及存储介质

Also Published As

Publication number Publication date
CN110322513B (zh) 2022-03-04
CN110322513A (zh) 2019-10-11

Similar Documents

Publication Publication Date Title
WO2019184885A1 (zh) 一种相机外参标定方法、装置及电子设备
CN107633536B (zh) 一种基于二维平面模板的相机标定方法及系统
CN107767422B (zh) 一种鱼眼镜头的校正方法、装置及便携式终端
CN106558080B (zh) 一种单目相机外参在线标定方法
WO2019192358A1 (zh) 一种全景视频合成方法、装置及电子设备
WO2021004416A1 (zh) 一种基于视觉信标建立信标地图的方法、装置
WO2021208486A1 (zh) 一种相机坐标变换方法、终端以及存储介质
CN107871329B (zh) 一种相机光学中心的快速标定方法及装置
CN106570907B (zh) 一种相机标定方法及装置
CN110779491A (zh) 一种水平面上目标测距的方法、装置、设备及存储介质
WO2019232793A1 (zh) 双摄像头标定方法、电子设备、计算机可读存储介质
CN116433737A (zh) 一种激光雷达点云与图像配准的方法、装置及智能终端
CN112365421A (zh) 图像矫正处理方法及装置
CN111462245A (zh) 一种基于矩形结构的变焦相机姿态标定方法和系统
TW202147820A (zh) 車載鏡頭的自動校正方法以及車載鏡頭裝置
WO2019205986A1 (zh) 一种图像存储方法、装置、电子设备及存储介质
CN112985360B (zh) 基于车道线的双目测距校正方法、装置、设备和存储介质
CN113379845A (zh) 一种相机标定方法及装置、电子设备及存储介质
CN109902695B (zh) 一种面向像对直线特征匹配的线特征矫正与提纯方法
CN110503604B (zh) 一种基于高精度pos的航空面阵影像实时正射拼接方法
CN108520541B (zh) 一种广角摄像机的标定方法
WO2019170066A1 (zh) 一种车载相机外参确定方法、装置、设备及系统
CN112614194B (zh) 一种图像采集设备的数据处理方法、系统及装置
CN113592934B (zh) 一种基于单目相机的目标深度与高度测量方法及装置
KR101703715B1 (ko) 카메라 광중심 측정을 위한 영상 처리 장치 및 그 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19777396

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19777396

Country of ref document: EP

Kind code of ref document: A1
