WO2021238923A1 - Camera parameter calibration method and device - Google Patents

Camera parameter calibration method and device

Info

Publication number
WO2021238923A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
camera
feature points
parameters
image
Prior art date
Application number
PCT/CN2021/095821
Other languages
English (en)
French (fr)
Inventor
许峰
杨盛
常新伟
齐焱
吴立斌
Original Assignee
追觅创新科技(苏州)有限公司
Priority date
Filing date
Publication date
Priority claimed from CN202020890263.5U
Priority claimed from CN202010447821.5A
Application filed by 追觅创新科技(苏州)有限公司
Publication of WO2021238923A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • This application relates to the field of computer vision technology, and in particular to a method and device for calibration of camera parameters.
  • This application provides a method and device for calibration of camera parameters to improve the efficiency and accuracy of calibration of camera parameters.
  • the embodiment of the application provides a method for calibrating camera parameters, including:
  • setting the camera at a specific position so that the camera and N calibration pieces satisfy a specific positional relationship, N ≥ 3;
  • controlling the camera to acquire calibration piece images, where each calibration piece image contains images of the N calibration pieces;
  • the internal and external parameters of the camera are calculated according to the position data of the image feature points in the calibration piece image and the position data of the calibration piece feature points of the calibration pieces.
  • setting the camera at a specific position so that the camera and the N calibration parts meet the specific positional relationship includes:
  • All the N calibration parts are located in the field of view of the camera, and the distance between the optical center of the camera and a specific calibration part among the N calibration parts is a predetermined distance, and the specific calibration part is the calibration part directly facing the camera.
  • the method is applied to an automatic mobile device that includes a camera; correspondingly, making all N calibration parts located within the field of view of the camera and making the distance between the optical center of the camera and a specific calibration part of the N calibration parts a predetermined distance includes:
  • the internal and external parameters of the camera calculated include:
  • the internal and external parameters of the camera are calculated according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
  • the internal and external parameters of the camera obtained by calculation include:
  • the internal parameters of the camera are calculated;
  • the external parameters of the camera are calculated according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part.
  • the specific calibration part is the calibration part facing the camera.
  • the coordinates of the image feature points are sub-pixel precision two-dimensional coordinates of the image feature points, and the sub-pixel precision two-dimensional coordinates are calculated in the following manner:
  • the original two-dimensional coordinates of the image feature points are calculated iteratively to obtain the sub-pixel precision two-dimensional coordinates of the image feature points.
  • the calculated internal parameters of the camera include:
  • the calculated external parameters of the camera include:
  • the corresponding relationship between the two-dimensional coordinates and the three-dimensional coordinates is constructed based on the two-dimensional coordinates of the image feature points of the corresponding specific calibration parts and the three-dimensional coordinates of the feature points of the calibration parts;
  • the parameters of the corresponding objective function are calculated iteratively until the reprojection error is less than the preset threshold, and the external parameters are obtained.
  • the method further includes:
  • controlling the camera to obtain the calibration piece includes:
  • the calibration piece includes a calibration board with a checkerboard, and the planes on which the checkerboards on the N calibration boards are located intersect in pairs; correspondingly, the image feature points are the checkerboard corner points of the calibration boards in the image, and the calibration piece feature points include the checkerboard corner points of the calibration boards.
  • An embodiment of the application also provides a camera parameter calibration device, which includes:
  • Calibration module including N calibration parts, N ⁇ 3;
  • the camera positioning unit is used to position the camera at a specific position so that the camera and the N calibration parts meet the specific positional relationship;
  • the control unit is used to control the camera to obtain the image of the calibration part, and each image of the calibration part contains the images of N calibration parts;
  • the processing unit is configured to calculate the internal and external parameters of the camera based on the position data of the image feature points in the calibration piece image and the position data of the calibration piece feature points of the calibration piece.
  • the camera positioning unit is configured to:
  • All the N calibration parts are located in the field of view of the camera, and the distance between the optical center of the camera and a specific calibration part among the N calibration parts is a predetermined distance, and the specific calibration part is the calibration part directly facing the camera.
  • the device is applied to an automatic mobile device including a camera, and correspondingly, the camera positioning unit includes:
  • the positioning board is used to place the automatic moving equipment so that all the N calibration parts are in the field of view of the camera;
  • the detection sensor is used to provide detection information that reflects whether the automatic mobile device is placed horizontally.
  • it also includes:
  • the detection unit is used to detect whether the image feature points in the calibration part image meet the preset conditions
  • the processing unit calculates the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
  • the processing unit is configured to: according to the coordinates of the image feature points of the N-1 calibration parts except the specific calibration part in the calibration part image, and the coordinates of the calibration part feature points of the N-1 calibration parts , Calculate the internal parameters of the camera; and based on the calculated internal parameters, according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part, the external parameters of the camera are calculated; the specific calibration part is The calibration piece facing the camera.
  • one image includes images of multiple calibration parts, which can not only effectively improve the calibration efficiency, but also ensure the stability of the image acquisition quality, and can achieve batch calibration.
  • using the pinhole camera model to realize the 2D-3D coordinate conversion of the image feature points means that three-dimensional spatial feature points are mapped onto the two-dimensional imaging plane; the distance from the calibration piece feature points (checkerboard corner points) to the optical center of the camera, obtained by the laser sensor, provides high-precision focal length data for the pinhole camera model.
  • FIG. 1 shows a flowchart of the camera parameter calibration method in this application
  • Figure 2 shows a model diagram of a pinhole camera
  • Figure 3 shows a schematic diagram of the reprojection error
  • Fig. 4 shows a structural diagram of the camera parameter calibration device in this application
  • Fig. 5 shows a structural diagram of the camera parameter calibration device in the present application with a shell removed
  • Fig. 6 shows an external schematic diagram of the camera parameter calibration device in this application
  • Figure 7 shows the bottom structure diagram of the camera parameter calibration equipment in this application.
  • Fig. 8 shows a side view of the camera parameter calibration device in the present application with the housing removed;
  • FIG. 9 shows the connection structure diagram of the calibration board of the camera parameter calibration equipment in this application.
  • FIG. 10 shows a schematic diagram of the fourth support plate of the camera parameter calibration device in the present application.
  • FIG. 11 shows the camera parameter calibration equipment in this application;
  • FIG. 12 shows a schematic diagram of the state of measuring the distance of the sweeping robot when the camera parameter calibration device in the present application is provided with a distance measuring component
  • FIG. 1 is a schematic flowchart of a method for calibrating camera parameters provided by an embodiment of the present disclosure.
  • the method can be applied to the parameter calibration of various types of cameras. Specifically, for example, it can be applied to the parameter calibration of the camera of an automatic mobile device.
  • the automatic mobile device can be any electronic device or smart device that can move automatically, such as a sweeping robot, a sweeping and mopping robot, an automatic lawn mower, or a snow sweeper. Specifically, as shown in FIG. 1, the method includes:
  • step S100 the camera is set at a specific position, so that the camera and the N calibration parts meet the specific positional relationship, N ⁇ 3.
  • the calibration element may be a calibration component or assembly of any shape or type with feature points of the calibration element, and the feature points of the calibration element may be visual feature points arranged in an orderly or disorderly manner on the calibration element.
  • the calibration member may be a checkerboard calibration board
  • the checkerboard calibration board may be a square board
  • the surface used for calibration has a checkerboard
  • the corners of the checkerboard can be used as the visual feature points of the calibration board, that is, as the feature points of the calibration piece.
  • the calibration component may also be other calibration components with visual feature points, such as a calibration block and a calibration sheet with visual feature points, which are not limited in this application.
  • the number N of calibration pieces can be any value greater than or equal to 3, for example, it can be 4, 5, and so on.
  • the greater the number of calibration parts, the higher the calibration accuracy of the camera parameters. Therefore, in the specific implementation process, the implementer can determine the number of calibration parts according to actual conditions or actual calibration accuracy requirements, which is not limited in this application.
  • step S200 the camera is controlled to acquire images of calibration parts, and each calibration part image includes images of N calibration parts.
  • the calibration part image is an image obtained by the camera shooting once. Since the camera and the calibration part meet a specific positional relationship, the calibration part image obtained by the camera at one time can include images of N calibration parts.
  • the calibration component may be a calibration board with a checkerboard.
  • the calibration component images acquired by the camera at one shot include N checkerboard images on the calibration board.
  • step S300 the internal and external parameters of the camera are calculated according to the position data of the image feature points in the calibration piece image and the position data of the calibration piece feature points of the calibration piece.
  • the pinhole camera model is used to realize the 2D-3D coordinate conversion of the corner points, which means that the three-dimensional space points are mapped to the two-dimensional imaging plane, and the internal and external parameters of the camera can be calculated through a single image, which simplifies the calibration process , Improve the calibration efficiency.
  • step S100 includes:
  • step S110 all the N calibration parts are located within the field of view of the camera, and the distance between the optical center of the camera and a specific calibration part of the N calibration parts is a predetermined distance, and the specific calibration part is the calibration part directly facing the camera.
  • the image of the calibration piece contains 4 calibration boards, which ensures that all checkerboards used for calibration are within the field of view of the camera and that every square of every calibration board is located within the generated image area to be calibrated.
  • the distance between the optical center of the camera and the center of the calibration board directly facing the camera is fixed: for example, in the captured calibration piece image the border of the checkerboard calibration board coincides with the border of the image, or the distance between the border of the checkerboard calibration board and the border of the calibration piece image is less than 2 mm.
  • step S110 includes:
  • step S111 the automatic mobile device is placed on a predetermined positioning board.
  • the automatic mobile device is a sweeping robot
  • the camera is located on the top of the sweeping robot. Place the sweeping robot on the positioning plate so that the camera above the sweeping robot can obtain the image of the calibration piece.
  • a groove is provided on the positioning plate to fix the driving wheel of the cleaning robot.
  • Step S112 according to the detection information of the laser sensor arranged around the positioning plate, adjust the automatic moving device to place it horizontally on the positioning plate.
  • a laser sensor is set directly above each of any two corner points of the positioning plate, and the distance to the optical center of the camera on the automatic mobile device is measured at the same time.
  • when the distances measured by the four laser sensors are consistent with the preset distance, the automatic mobile device is considered to be in place and the camera parameters can be calibrated.
  • the distance between the corner points of multiple calibration parts and the optical center of the camera is acquired through the laser sensor, which provides a high-precision focal length data for the pinhole camera model, so that the internal and external parameters can be calibrated through a single image.
  • the coordinate conversion and other calculations in the calibration process consume a short time, which greatly reduces the requirements on the operator, saves time, and improves the robustness and accuracy of the calibration.
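As a hedged illustration of the placement check described above (not part of the patent text), the following sketch compares the distances reported by the laser sensors against the preset distance; the sensor-reading values, preset distance, and tolerance are assumptions.

```python
# Hypothetical sketch of the placement check: the device is considered "in place"
# only when every laser sensor reads the preset camera-to-sensor distance.
PRESET_DISTANCE_MM = 150.0   # assumed preset distance
TOLERANCE_MM = 1.0           # assumed acceptance tolerance

def device_is_in_place(sensor_readings_mm):
    """sensor_readings_mm: distances measured by the laser sensors (one per sensor)."""
    return all(abs(d - PRESET_DISTANCE_MM) <= TOLERANCE_MM for d in sensor_readings_mm)

# Example: four sensors mounted above the positioning plate
readings = [149.8, 150.3, 150.1, 149.9]
if device_is_in_place(readings):
    print("Device in place; calibration can start.")
else:
    print("Adjust the device on the positioning plate.")
```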
  • step S300 includes:
  • step S310 it is detected whether the image feature points in the calibration piece image meet the preset conditions.
  • This state is mainly used to determine whether the acquired image meets the calibration requirements. If the acquired image itself does not allow the effective segmentation of the calibration boards and the corner detection required for the calibration parameters, a prompt that detection of the calibration boards has failed is displayed on the terminal interface. The reasons for failure include, but are not limited to, the light being too dark or the camera's view of the checkerboard being partially occluded; in the program this ultimately manifests as an inability to effectively detect checkerboard inner corner points that meet the standard. For example, a relatively mature detection method in OpenCV is used for detection, and the return value of the function is used to determine whether corner detection succeeded. If the detection failure is caused by the light being too dark, this can also be judged in advance in another way.
  • Before the checkerboards are segmented, the brightness of the collected image is detected first, because when the tooling was designed, light strips were used to illuminate the entire scene so that the area to be imaged receives adequate light regardless of changes in the external environment. Therefore, if the overall brightness of the image is lower than the set threshold, detection of the calibration boards is determined to have failed.
  • This fault state mainly arises when the operator has problems placing the machine on the positioning plate, for example randomly placing the machine carrying the camera to be calibrated on the positioning plate without locking the machine in place as provided by the positioning plate, which causes the machine to have a large deviation in pitch angle and/or roll angle and/or yaw angle. If the deviation is caused by the installation of the machine itself, this angle can be captured by the external parameters; however, if the error is caused by an operator's mistake, the error accumulates and cannot be eliminated, which is a large potential risk. Therefore, in response to this problem, the program judges the above angles and constrains the problem with a set threshold; if an angle exceeds the specified threshold, a fault of abnormal machine attitude is displayed on the terminal interface, which reminds the operator to check the placement of the machine and effectively prevents this situation from occurring.
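As a hedged illustration of the image-quality checks described above (brightness threshold plus OpenCV chessboard corner detection), a minimal sketch might look as follows; the thresholds and the 11x8 inner-corner pattern size are assumptions, not values from the patent.

```python
import cv2

BRIGHTNESS_THRESHOLD = 60   # assumed minimum mean gray level
PATTERN_SIZE = (11, 8)      # assumed number of inner corners per checkerboard

def check_calibration_image(gray, pattern_size=PATTERN_SIZE):
    """Return (ok, corners). Fails if the image is too dark or corners are not found."""
    if gray.mean() < BRIGHTNESS_THRESHOLD:
        return False, None                      # "calibration board detection failed": too dark
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    return bool(found), corners                 # the return value decides success, as in the text

img = cv2.imread("calibration_view.png", cv2.IMREAD_GRAYSCALE)
ok, corners = check_calibration_image(img)
print("image accepted" if ok else "detection of calibration board failed")
```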
  • step S320 in the case that the image feature points meet the preset conditions, the internal parameters and external parameters of the camera are calculated according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
  • step S320 includes:
  • Step S321 according to the coordinates of the image feature points of the N-1 calibration parts except the specific calibration part and the coordinates of the calibration part feature points of the N-1 calibration parts in the calibration part image, the internal parameters of the camera are calculated .
  • the specific calibration component is a calibration component parallel to the camera, and the remaining three calibration components are arranged obliquely with respect to the camera.
  • the variables that need to be calibrated include the focal length (fx, fy), the aperture center (cx, cy), and the distortion parameters (k1, k2, p1, p2); the 2D coordinates of the detected checkerboard corner points of the three calibration pieces required for internal parameter calibration are obtained separately.
  • since the checkerboard template of a calibration piece is known, the 3D point coordinates corresponding to each corner point can easily be obtained, with the z axis set to 0.
  • given multiple pairs of corresponding 3D-2D points, the pose of the camera can be estimated by the PnP method, so the initial relative pose of each segmented checkerboard can be obtained; these poses are used to construct correspondences among multiple sets of inner corner points in the multiple checkerboards, and the solution is obtained by constructing a least-squares problem and solving it by iterative optimization.
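A minimal sketch of this intrinsic step, assuming the per-checkerboard 3D template points (z = 0) and detected 2D corners are already available; it uses OpenCV's cv2.calibrateCamera, which internally performs an initial pose estimation and an iterative least-squares refinement similar to what is described here (the patent's own implementation may differ).

```python
import numpy as np
import cv2

def make_object_points(cols, rows, square_mm):
    """3D template of one checkerboard: z = 0, known square size."""
    obj = np.zeros((rows * cols, 3), np.float32)
    obj[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_mm
    return obj

def calibrate_intrinsics(corners_per_board, cols, rows, square_mm, image_size):
    """corners_per_board: list of (M,1,2) float32 corner arrays, one per tilted board."""
    obj = make_object_points(cols, rows, square_mm)
    object_points = [obj] * len(corners_per_board)
    # Optimizes fx, fy, cx, cy and distortion (k1, k2, p1, p2, ...) by minimizing reprojection error.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, corners_per_board, image_size, None, None)
    return rms, K, dist
```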
  • Step S322 Based on the calculated internal parameters, the external parameters of the camera are calculated according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part.
  • the specific calibration part is the calibration part facing the camera.
  • the external parameters of the camera are calculated through the pinhole camera model.
  • the coordinates of the image feature points are sub-pixel precision two-dimensional coordinates of the image feature points, and the sub-pixel precision two-dimensional coordinates are calculated in the following manner:
  • the original two-dimensional coordinates of the image feature points are calculated iteratively to obtain the sub-pixel precision two-dimensional coordinates of the image feature points.
  • feature detection and position calculation are performed on each of the divided chessboards.
  • the feature here is the detected corner points.
  • inner corner point detection is performed on each segmented sub-image; if corner points of the set template number are all detected and arranged in a certain order, the coordinates of all inner corner points can be obtained.
  • the inner corner points can be numbered sequentially in row or column order, and the coordinates acquired at this stage are at the pixel level.
  • the sub-pixel precision corner point coordinates can be obtained through the iterative solution method to improve the accuracy, thereby making the parameter calibration result more accurate.
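The sub-pixel refinement of the pixel-level corner coordinates can be sketched with OpenCV's iterative cv2.cornerSubPix (a hedged illustration; the window size and termination criteria are assumptions, chosen to mirror the "iteration count or accuracy" stopping rule described below).

```python
import cv2

def refine_corners_subpixel(gray, corners):
    """Iteratively refine pixel-level corner coordinates to sub-pixel precision."""
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 0.001)
    return cv2.cornerSubPix(gray, corners, winSize=(5, 5), zeroZone=(-1, -1), criteria=criteria)
```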
  • the internal parameters of the camera calculated in step S321 include:
  • the image is preprocessed.
  • through experimental comparison of filtering algorithms such as Gaussian filtering, median filtering, and mean filtering,
  • Gaussian filtering was found to have the best preprocessing effect.
  • the acquired image is therefore preprocessed with the Gaussian filtering method.
  • the principle of this linear filtering method is a weighted-average process over the entire image.
  • the value of each pixel is obtained as the weighted average of itself and the other pixel values in its neighborhood, which can effectively suppress noise that follows a normal distribution.
  • Gaussian filtering essentially applies a two-dimensional normal distribution to a two-dimensional matrix, and the value of G(x, y) represents the weight in the matrix; the Gaussian filtering result is then normalized to smooth the image.
  • x and y respectively represent the abscissa and ordinate of a pixel in the image to be calibrated, and s represents the variance of the Gaussian model.
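A minimal sketch of this preprocessing step using OpenCV's Gaussian filter; the kernel size and sigma are assumptions, and the normalization of the kernel weights is handled internally by cv2.GaussianBlur.

```python
import cv2

def preprocess(image):
    # Weighted average of each pixel with its neighborhood using 2D Gaussian weights
    # G(x, y); suppresses noise that follows a normal distribution.
    return cv2.GaussianBlur(image, ksize=(5, 5), sigmaX=1.0)
```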
  • a laser sensor is also set on each calibration board to obtain the distance from the camera's optical center to each corner point of the top-view checkerboard, which is a prerequisite for high-precision calibration of subsequent parameters.
  • the internal corner points are detected for each sub-image after segmentation. If the corner points of the set number of templates are detected and arranged in a certain order, the coordinates of all the internal corner points can be obtained.
  • M represents the intensity at the corresponding pixel position,
  • A represents the intensity of the background,
  • B represents the peak intensity in the bright area,
  • (u, v) represents the position of the peak of the bright area,
  • M(x, q) represents the measured intensity of the pixel,
  • I(x) represents the pixel intensity estimated by the Gaussian model.
  • the true position information of the corner points in the image to be calibrated can thus be detected.
  • ideally M(x, q) = I(x), that is, the measured intensity of the pixel equals the intensity estimated by the model; however, due to various factors the two cannot be exactly equal, so the least-squares method can be used to find the optimal solution, namely formula (2), and formula (2) is iteratively optimized to obtain the global optimal solution or a local optimal solution.
  • the end condition of the iteration is a limit on the number of iterations or a limit on the accuracy range; for example, the number of iterations may be set to 50, or the iteration is stopped if the change in the result stays within 0.2% for several consecutive iterations.
  • the number of iterations and the limited accuracy range are set according to actual needs.
  • the 2D coordinates of the corner points of the three checkerboards detected for internal parameter calibration can be obtained separately. Since the obtained image itself has not been undistorted, the obtained coordinates are distorted coordinates; according to the initial calibration parameters, the acquired image can be undistorted and the undistorted inner corner coordinates obtained.
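Undistorting the detected (distorted) corner coordinates with the initial calibration parameters might look like the following sketch using cv2.undistortPoints; K_init and dist_init stand for the initial intrinsic matrix and distortion coefficients and are assumed placeholder values.

```python
import cv2
import numpy as np

def undistort_corner_coords(corners_px, K_init, dist_init):
    """corners_px: (M,1,2) distorted pixel coordinates of inner corners."""
    # P=K_init maps the normalized, undistorted points back to pixel coordinates.
    return cv2.undistortPoints(corners_px, K_init, dist_init, P=K_init)

K_init = np.array([[600.0, 0, 320.0], [0, 600.0, 240.0], [0, 0, 1.0]])  # assumed values
dist_init = np.array([0.05, -0.1, 0.001, 0.001])                        # k1, k2, p1, p2 (assumed)
```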
  • the external parameters of the camera calculated in step S322 include:
  • the corresponding relationship between the two-dimensional coordinates and the three-dimensional coordinates is constructed based on the two-dimensional coordinates of the image feature points of the corresponding specific calibration parts and the three-dimensional coordinates of the feature points of the calibration parts;
  • the parameters of the corresponding objective function are calculated iteratively until the reprojection error is less than the preset threshold, and the external parameters are obtained.
  • the world coordinate P_c corresponding to the inner corner point is projected to the camera coordinate P' through the pinhole camera model:
  • the camera is abstracted as a pinhole camera model for explanation.
  • the imaging principle of a single camera can be approximated as a pinhole camera model, which is simple and effective, which means that a three-dimensional space point is mapped to a two-dimensional imaging plane.
  • x points to the right
  • y points downward
  • z points to the front of the camera (subject to common abstract models, regardless of the order of the actual camera coordinate system).
  • f is the focal length
  • m is the zoom factor of the inner corner point on the u axis after being converted to the pixel coordinate system
  • n is the zoom factor of the inner corner point on the v axis after being converted to the pixel coordinate system
  • [c_x, c_y]^T is the translation of the origin
  • the intermediate matrix K is the internal parameter matrix, and the parameters of the intermediate matrix K are calibrated by Zhang Zhengyou calibration method
  • P_c = T · P_w, where T is the transformation matrix, R is the rotation parameter, and t is the translation parameter.
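For reference, the standard pinhole projection that these symbols describe can be written compactly as follows (a hedged reconstruction using the conventional form; f_x and f_y combine the focal length f with the scaling factors m and n under the usual convention):

```latex
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = K\,P_c
  = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}
    \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix},
\qquad
P_c = T\,P_w = R\,P_w + t .
```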
  • the coordinates of a three-dimensional point in the camera coordinate system are not easy to know, but it is easy to obtain the space point coordinates in the world coordinate system.
  • the pinhole model is an ideal model.
  • the tangential and radial distortion caused by the camera lens will cause the position of the light to pass through the lens and project to the imaging surface to change.
  • as for the distortion parameters, in the external parameter calibration step the methods for obtaining the coordinates of the 3D points and 2D points are similar, and the optimization is likewise performed by constructing a least-squares problem.
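The radial and tangential distortion named by the parameters k1, k2, p1, p2 is commonly modeled as follows (a hedged reconstruction of the standard model; the patent does not state the exact formula), where (x, y) are normalized image coordinates and r^2 = x^2 + y^2:

```latex
x_{dist} = x\,(1 + k_1 r^2 + k_2 r^4) + 2 p_1 x y + p_2 (r^2 + 2x^2), \\
y_{dist} = y\,(1 + k_1 r^2 + k_2 r^4) + p_1 (r^2 + 2y^2) + 2 p_2 x y .
```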
  • the coordinates of the camera's optical center on the x, y, and z axes need to be calculated separately; these coordinates can be calculated from the standard dimensions of the tooling. It should be noted that the accuracy of the external parameters largely depends on the high accuracy of the tooling itself, that is, on the distance between the calibration plate and the optical center, for which the laser sensor provides high-precision measurement data.
  • the internal parameter calibration solution requires 8 variables to be optimized, namely the internal parameter variables: the focal length parameters (fx, fy), the aperture center parameters (cx, cy), and the distortion parameter variables (k1, k2, p1, p2).
  • through iteration, the optimized variable values can finally be obtained, and the corresponding maximum projection error and average projection error can be calculated at the same time to measure the accuracy of the result.
  • the solution of the external parameters involves the solution of the rotation matrix and the translation matrix. Based on the internal parameters that have been solved, the optimization problem can also be constructed to minimize the reprojection error and iteratively solve the external parameter variables.
  • the mainstream method of nonlinear optimization is graph optimization.
  • the optimization problem can be described as: given an objective function, optimize the variables to be optimized under the corresponding constraint conditions, and finally obtain the optimal solution.
  • Graph optimization is to express the optimization problem in the form of a graph. It constructs the corresponding relationship from 3D to 2D, and iteratively solves it by minimizing the reprojection error.
  • the objective function is as follows:
  • the e function represents the error, given by the difference between the observation z_i and the observation equation, and the information matrix Ω is the inverse of the covariance matrix.
  • This formula is the optimization problem composed of multiple nodes and edges in Figure 3.
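The objective referred to above (its formula image is not reproduced in this text) typically has the standard graph-optimization form shown below, consistent with the description of e, z_i and the information matrix Ω; this is a hedged reconstruction rather than the patent's exact expression:

```latex
F(x) = \sum_i e_i(x)^{\top}\,\Omega_i\,e_i(x),
\qquad e_i(x) = z_i - h_i(x),
\qquad x^{*} = \arg\min_x F(x).
```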
  • an initial value and iterative direction are required.
  • an increment ⁇ x k is needed to calculate the current estimated value.
  • the Jacobian matrix and Hessian matrix According to the selected gradient descent strategy (that is, determine the increment ⁇ x k ), the more representative ones are the GN and LM algorithms. Both of these are solving incremental equations. If ⁇ x k is small enough during the solving process, the iteration stops, otherwise the iteration continues.
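A compact numerical sketch of the incremental update described here (Gauss-Newton flavor; the residual and Jacobian functions, iteration limit, and step threshold are assumptions for illustration):

```python
import numpy as np

def iterate_gauss_newton(x, residual_fn, jacobian_fn, max_iter=50, step_tol=1e-6):
    """Solve the incremental equation J^T J dx = -J^T e until dx is small enough."""
    for _ in range(max_iter):
        e = residual_fn(x)                     # stacked errors e_i(x)
        J = jacobian_fn(x)                     # stacked Jacobian of the errors
        dx = np.linalg.solve(J.T @ J, -J.T @ e)
        x = x + dx
        if np.linalg.norm(dx) < step_tol:      # iteration stops when the increment is small enough
            break
    return x
```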
  • the method further includes:
  • the design of the program can independently determine the quality of the calibration result without human judgment.
  • the result is judged by the reprojection error, including the maximum projection error and the average projection error, to ensure that the final result is reliable. If the obtained projection error exceeds the set threshold, the fault detection will display a prompt on the terminal interface that the result is unreasonable, so that after seeing the prompt the operator can run some tests on the machine and then try to re-calibrate.
  • this fault state mainly prevents the operator from issuing calibration instructions multiple times. Since the calibration program is only a part of the whole machine's running program, the time consumed by this part of the program needs to be effectively controlled, so an asynchronous scheme is adopted in the program design.
  • the calibration program itself actively constructs an asynchronous thread for calculation and does not block the operation of the main thread; before the previously started calibration sub-thread has finished running, even if a calibration instruction is received again, it will not be responded to again.
  • in that case the fault detection will display "task busy" on the terminal interface, so that the operator is told that there is no need to repeat the operation and should wait for the result to return, then proceed to the next step according to whether the calibration succeeded or failed.
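The "task busy" behavior described above might be sketched like this (a minimal illustration using Python threading; names such as run_calibration are hypothetical, not the patent's actual program):

```python
import threading

_calibration_thread = None

def request_calibration(run_calibration):
    """Start calibration in a background thread; reject new requests while one is running."""
    global _calibration_thread
    if _calibration_thread is not None and _calibration_thread.is_alive():
        print("task busy")          # do not respond to repeated calibration instructions
        return False
    _calibration_thread = threading.Thread(target=run_calibration, daemon=True)
    _calibration_thread.start()     # the main thread is not blocked
    return True
```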
  • Detecting the file writing status: this status is mainly used to confirm that the result is valid and has been written to the memory of the machine. After the calibration is completed, the generated calibration result is written to the specified calibration file; in order to ensure that the calibration result has been completely written into the calibration file, the file is read again after writing, and if there is no problem in reading and writing, the calibration is considered successful. If the calibration result has not been completely written into the calibration file, the fault detection will display a file writing failure on the terminal interface, and the operator will perform the calibration again according to the prompt.
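The write-then-read-back verification of the calibration file could look roughly like the following sketch (the file name and JSON format are assumptions):

```python
import json

def save_and_verify_calibration(result, path="camera_calibration.json"):
    """Write the calibration result and read it back to confirm it was fully written."""
    with open(path, "w") as f:
        json.dump(result, f)
    try:
        with open(path) as f:
            return json.load(f) == result     # calibration considered successful
    except (OSError, ValueError):
        return False                          # "file writing failure" is reported to the operator
```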
  • controlling the camera to obtain the calibration component includes:
  • the calibration board is fixedly installed in a semi-enclosed box, and a light source is arranged inside the box.
  • illumination compensation of the calibration board is performed when the camera is shooting, which eliminates the interference of external light and also enables the device to work around the clock.
  • two sets of light strips are installed inside the semi-enclosed tooling. Regardless of the external light, the device can provide suitable light to obtain a reliable image to be calibrated.
  • the camera is controlled to take pictures of the marker through wireless means such as Bluetooth, which is convenient for the staff to operate.
  • the calibration element includes a calibration board with a checkerboard.
  • the planes on which the checkerboards on the N calibration boards are located intersect in pairs.
  • the image feature points are the checkerboard corner points of the calibration board in the image.
  • the calibration piece feature points include the checkerboard corner points of the calibration board.
  • the calibration device has four checkerboard calibration boards, one of which is horizontally facing downwards, and the other three are arranged obliquely respectively.
  • the planes on which the four calibration boards are located intersect and are not parallel to each other.
  • the calibration surfaces all face the same direction, and the four calibration boards are fixed by screws and brackets.
  • at least one laser sensor is installed on each of the three calibration boards other than the horizontally downward-facing calibration board, and the laser sensors detect whether the machine carrying the camera on its top is placed horizontally on the positioning board, for example whether the top plane of a sweeping robot is horizontal so that the lens of the camera to be calibrated is also horizontal.
  • the laser sensors can obtain the distance from the optical center of the camera to each corner of the top-view checkerboard, which is a prerequisite for the high precision of the subsequent parameter calibration.
  • Since the specific calibration component is used for external parameter calibration, it is necessary to ensure that the camera directly faces its calibration surface, so as to ensure that the distances between the feature points of the calibration surface and the optical center of the camera are accurate. For automatic mobile equipment, it is necessary to ensure that the camera mounting surface is parallel to the specific calibration part. On the one hand, a positioning plate parallel to the specific calibration part can be used to initially ensure that the camera mounting surface is parallel to the calibration surface of the specific calibration part; on the other hand, corresponding detection means are also needed to determine whether the camera mounting surface is indeed parallel to the calibration surface of the specific calibration component.
  • the laser sensor detects whether the machine with the camera on the top is placed horizontally on the positioning board. At the same time, the laser sensor obtains the distance from the optical center of the camera to each corner of the top-view checkerboard, which is a prerequisite for the high accuracy of subsequent parameter calibration.
  • the pinhole camera model is used to realize the 2D-3D coordinate conversion of the corner points, which means that the three-dimensional space points are mapped to the two-dimensional imaging plane.
  • the distance between the corner points and the optical center of the camera, obtained by the laser sensor, provides high-precision focal length data for the pinhole camera model, so that the calibration of internal and external parameters can be achieved from a single image; the coordinate-conversion calculations during the calibration process take a short time, which greatly reduces the requirements on the operator, saves time, and improves the robustness and accuracy of the calibration.
  • the present application also provides a camera parameter calibration device.
  • the device can be applied to execute the above method, and can be applied to camera calibration of automatic mobile equipment.
  • 4 is a schematic diagram of the module structure of an embodiment of a camera parameter calibration device provided by the present disclosure. Specifically, as shown in FIG. 4, the device may include:
  • Calibration module 401 including N calibration parts, N ⁇ 3;
  • the camera positioning unit 402 is used for positioning the camera 403 at a specific position so that the camera 403 and the N calibration parts meet the specific positional relationship;
  • the control unit 404 is configured to control the camera 403 to obtain calibration piece images, and each calibration piece image contains images of N calibration pieces;
  • the processing unit 405 is configured to calculate the internal and external parameters of the camera based on the position data of the image feature points in the calibration piece image and the position data of the calibration piece feature points of the calibration piece.
  • the calibration module further includes a semi-closed box 1.
  • the calibration components are calibration plates; for example, there are four calibration plates, that is, the above-mentioned N is 4.
  • the four calibration boards are the first calibration board 2, the second calibration board 3, the third calibration board 4, and the fourth calibration board 5; the device further includes at least four laser sensors 6 and the calibration module. An opening is provided at the bottom of one side of the semi-enclosed box 1; the first calibration board 2 is horizontally fixed to the top of the semi-enclosed box 1; the second calibration board 3 and the third calibration board 4 are fixed on side surfaces of the semi-enclosed box 1; the fourth calibration board 5 is fixed on the side of the semi-enclosed box 1 opposite to the opening side; the at least four laser sensors 6 are respectively provided at the bottom-end vertices of the first calibration board 2, the second calibration board 3, the third calibration board 4, and the fourth calibration board 5; the calibration module is communicatively connected to the at least four laser sensors 6. The planes on which the first calibration board 2, the second calibration board 3, the third calibration board 4, and the fourth calibration board 5 are located intersect in pairs and are not parallel to each other.
  • the first calibration board 2 is installed horizontally on the top of the semi-enclosed box 1, and its calibration surface is located directly above the positioning board 7 and parallel to the positioning board; the second calibration board 3, the third calibration board 4, and the fourth calibration board 5 are respectively installed on the side surfaces of the semi-enclosed box 1, and their calibration surfaces are all inclined toward the positioning board 7.
  • the calibration module is set on the machine where the camera to be calibrated is located, and is connected to at least four laser sensors through a wireless network or Bluetooth.
  • the calibration module processes and calculates the image of the camera to be calibrated, and combines the distance information from each calibration board received from the laser sensor to the optical center of the camera to be calibrated to calibrate the parameters of the camera.
  • the calibration module is also controlled by an external Bluetooth signal to start the calibration thread.
  • the external Bluetooth signal can be generated by the Bluetooth switch and sent to the calibration module.
  • the calibration module also includes a positioning groove 7 fixed above the bottom plate on the side of the semi-closed box 1 with an opening.
  • the shape of the positioning groove 7 is consistent with the contour of the bottom of the machine.
  • the first transverse scale line 8 and the first longitudinal scale line 9 are arranged below the positioning slot 7; the first transverse scale line 8 and the first longitudinal scale line 9 are perpendicular to each other, and their intersection passes through the center of the positioning slot 7.
  • the calibration module also includes a first support plate 10, which is horizontally fixed in the semi-enclosed box 1 by a first bracket 11, and a second support plate 12, which is horizontally fixed in the semi-enclosed box 1 by a second bracket 13.
  • the first support plate 10 and the second support plate 12 are at the same height, and the first support plate 10 and the second support plate 12 are each parallel to one side of the semi-enclosed box 1.
  • a third support plate 14 is fixed, by a first support block 26 and a second support block 27, on the edge sides of the first support plate 10 and the second support plate 12.
  • the second calibration board 3, the third calibration board 4, and the fourth calibration board 5 are respectively inclined and fixed on the first support plate 10, the second support plate 12, and the third support plate 14.
  • specifically, the third support plate 14 is fixed on the edge sides of the first support plate 10 and the second support plate 12 by screws together with the first support block 26 and the second support block 27.
  • the support plate is provided with a card slot, and the calibration plate can be directly fixed on the support plate through the card slot.
  • the calibration module further includes: a first vertical connector 15, which is fixed in parallel under the first support plate 10 by at least two screws; a second vertical connector 16, which is fixed horizontally by at least two screws to the side of the first vertical connector 15 away from the inner wall of the semi-enclosed box 1; a rotatable connector 17, which is connected to the second vertical connector 16 through a pin 18; and a first slot plate 19, the back of which is fixed to the rotatable connector 17 and which is used to place the second calibration board 3.
  • a T-shaped connector is attached to the back of the first slot plate 19 to connect with the second vertical connector 16.
  • the pin shaft 18 can slide in the sliding rail of the rotatable connecting member 17 so as to change the inclination angle of the second calibration plate 3.
  • the remaining two calibration plates are also fixed on the corresponding support plates through the above-mentioned structure.
  • the calibration module further includes: a fourth support plate 20, which is fixed above the first support plate 10 and the second support plate 12 by four support columns 21, the fourth support plate 20 being higher than the third support plate 14; and a second slot plate 22, the back of which is connected with the fourth support plate 20 by fasteners, the second slot plate 22 being located below the fourth support plate 20 and used for placing the first calibration board 2.
  • the fourth support plate 20 is provided, on the side facing the second slot plate 22, with a second transverse scale line 23 and a second longitudinal scale line 24; the intersection of the second transverse scale line 23 and the second longitudinal scale line 24 passes through the center of the fourth support plate 20.
  • the second transverse scale line 23 and the second longitudinal scale line 24 extend beyond the vertical projection of the first calibration board onto the fourth support plate, which is convenient for installation by workers.
  • the calibration module also includes light strips 25 arranged on either side of adjacent openings or opposite openings at the bottom of the semi-closed box 1. By setting the light strips, the image is supplemented with light during the calibration process. Improve the pass rate of the calibration image.
  • the box opening of the box 1 faces the positioning plate, and the calibration piece 101 is installed inside the semi-closed box 1.
  • the semi-enclosed box reserves a window for the placement and removal of the automatic mobile device, and at the same time creates a dim environment for the calibration module; combined with the light strips 25 arranged in the semi-enclosed box, it provides lighting during the calibration process.
  • the camera positioning unit is configured to:
  • All the N calibration parts are located in the field of view of the camera, and the distance between the optical center of the camera and a specific calibration part among the N calibration parts is a predetermined distance, and the specific calibration part is the calibration part directly facing the camera.
  • the device is applied to an automatic mobile device including a camera.
  • the camera positioning unit includes:
  • the positioning board is used to place the automatic moving equipment so that all the N calibration parts are in the field of view of the camera;
  • the detection sensor is used to provide detection information that reflects whether the automatic mobile device is placed horizontally.
  • the above-mentioned positioning slot is provided on the positioning plate to restrict the movement of the automatic mobile device and ensure that the position of the automatic mobile device remains unchanged during the calibration process.
  • a positioning plate is arranged below the calibration module for placing the automatic moving equipment, and the positioning plate and the N calibration parts meet a specific positional relationship.
  • the positioning plate can also be arranged above the calibration module, as long as the two are arranged relative to each other.
  • the positioning plate arranged below the calibration module as an example to illustrate the specific structure.
  • one of the N calibration parts is located directly above the positioning plate.
  • the locating plate and the N calibration pieces satisfying a specific positional relationship may include: the locating plate is parallel to the calibration surface of the specific calibration piece 2.
  • satisfying a specific positional relationship between the positioning plate and the N calibration parts includes:
  • the calibration surfaces of the N calibration pieces are all within the field of view of the camera of the automatic mobile device placed on the positioning plate.
  • the four calibration plates are all within the field of view of the camera.
  • the camera can capture images of N calibration boards at a time.
  • it also includes:
  • the detection unit is used to detect whether the image feature points in the calibration part image meet the preset conditions
  • the processing unit calculates the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
  • the device may further include a distance measuring component, which is arranged at a specific position above the positioning plate, and is used to measure the distance from the camera installation surface of the automatic mobile device.
  • the distance measuring component may include four laser distance measuring sensors A, B, C, and D.
  • the laser sensors A and B may be arranged at the two corners of the lower end of the third calibration plate 4.
  • the laser sensors C and D can be arranged at the two corners of the lower end of the second calibration plate 3.
  • the laser sensors A, B, C, and D are located on the same plane, and the plane where A, B, C, and D are located is parallel to the first calibration plate 2 at the top. In this way, the four laser sensors can respectively detect
  • the distances to four different points on the upper surface of the sweeping robot (i.e., the camera mounting surface).
  • the alarm configured for the sensors can be used to generate prompt information, such as an indicator light or a sound indication, to instruct the implementer to adjust the placement of the sweeping robot so that its camera mounting surface is parallel to the first calibration plate 2.
  • the laser sensors may also be arranged in other positions, and the number of laser sensors only needs to be greater than or equal to 3, which is not limited in the present application.
  • the distance measuring component includes at least three laser distance measuring sensors, which are arranged at different positions on the same plane above the positioning plate; the laser beams irradiate different positions on the camera mounting surface to measure the distances from those different positions to the same plane, so as to determine whether the camera mounting surface is parallel to the calibration surface of the specific calibration component, wherein the same plane is parallel to the calibration surface of the specific calibration component.
  • the laser sensors can obtain the distance from the optical center of the camera to each corner of the top-view checkerboard, which is a prerequisite for the high accuracy of the subsequent internal and external parameter calibration.
  • the processing unit is configured to: calculate the internal parameters of the camera according to the coordinates of the image feature points of the N-1 calibration parts other than the specific calibration part in the calibration part image and the coordinates of the calibration part feature points of the N-1 calibration parts; and, based on the calculated internal parameters, calculate the external parameters of the camera according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part;
  • the specific calibration part is the calibration part directly facing the camera.
  • the processing unit is also used to iteratively calculate the original two-dimensional coordinates of the image feature points to obtain sub-pixel precision two-dimensional coordinates of the image feature points.
  • the processing unit is also configured to: use multiple sets of corresponding two-dimensional coordinates of image feature points and three-dimensional coordinates of calibration piece feature points to construct the correspondence between two-dimensional coordinates and three-dimensional coordinates; and use the multiple sets of corresponding two-dimensional and three-dimensional coordinates to iteratively calculate the parameters of the objective function of the correspondence until the reprojection error is less than the preset threshold, to obtain the internal parameters, which include focal length parameters, aperture center parameters, and distortion parameters.
  • based on the calculated internal parameters, multiple sets of corresponding two-dimensional coordinates of the image feature points of the specific calibration piece and three-dimensional coordinates of the calibration piece feature points are used to construct the correspondence between two-dimensional coordinates and three-dimensional coordinates; the multiple sets of corresponding two-dimensional and three-dimensional coordinates are then used to iteratively calculate the parameters of the objective function of the correspondence until the reprojection error is less than the preset threshold, and the external parameters are obtained.
  • the processing unit is further configured to: determine whether the calculated internal parameters and external parameters meet the preset standard; and output a prompt message with a judgment result that does not meet the preset standard.
  • the device also includes a memory for storing internal and external parameters that meet preset standards.
  • the device also includes a lighting component for providing specific lighting conditions for the camera, a control instruction sending unit, and a wireless communication unit; the control instruction sending unit sends a control instruction for acquiring the calibration piece image to the camera through the wireless communication unit, so as to control the camera to acquire the calibration piece image.
  • the calibration piece includes a calibration board with a checkerboard.
  • the planes on which the checkerboards on the N calibration boards are located intersect in pairs.
  • the image feature points are the checkerboard corners of the calibration board in the image.
  • the calibration piece feature points include the checkerboard corner points of the calibration board.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

This application discloses a camera parameter calibration method and device. The method includes: setting a camera at a specific position so that the camera and N calibration pieces satisfy a specific positional relationship, N ≥ 3; controlling the camera to acquire calibration piece images, each calibration piece image containing images of the N calibration pieces; and calculating the internal parameters and external parameters of the camera according to position data of image feature points in the calibration piece images and position data of calibration piece feature points of the calibration pieces. With the embodiments of this application, calibration time can be effectively saved, and the robustness and accuracy of calibration are improved.

Description

Camera parameter calibration method and device
Technical Field
This application relates to the field of computer vision technology, and in particular to a camera parameter calibration method and device.
Background
In the field of computer vision, in order to determine the relationship between the three-dimensional geometric position of a point on the surface of a spatial object and its corresponding pixel in an image, a camera imaging model generally needs to be constructed, and the parameters of this model are the camera parameters. In most cases these parameters can only be obtained through experiments and calculation, and this process of solving the internal parameters, external parameters and distortion parameters is called camera calibration.
In existing camera calibration approaches, a calibration piece such as a checkerboard usually needs to be photographed multiple times from different angles to obtain multiple images, and constraints are then constructed from the multiple images to solve the internal and external parameters of the camera separately. The whole process is complicated and easily disturbed by the external environment, resulting in low calibration efficiency and low accuracy.
Summary
This application provides a camera parameter calibration method and device to improve the efficiency and accuracy of camera parameter calibration.
An embodiment of this application provides a camera parameter calibration method, including:
setting a camera at a specific position so that the camera and N calibration pieces satisfy a specific positional relationship, N ≥ 3;
controlling the camera to acquire calibration piece images, each calibration piece image containing images of the N calibration pieces;
calculating the internal parameters and external parameters of the camera according to position data of image feature points in the calibration piece images and position data of calibration piece feature points of the calibration pieces.
Optionally, setting the camera at a specific position so that the camera and the N calibration pieces satisfy the specific positional relationship includes:
making all N calibration pieces located within the field of view of the camera, and making the distance between the optical center of the camera and a specific calibration piece among the N calibration pieces a predetermined distance, the specific calibration piece being the calibration piece directly facing the camera.
Optionally, the method is applied to an automatic mobile device that includes a camera; correspondingly, making all N calibration pieces located within the field of view of the camera and making the distance between the optical center of the camera and a specific calibration piece among the N calibration pieces a predetermined distance includes:
placing the automatic mobile device on a predetermined positioning board;
adjusting the automatic mobile device so that it is placed horizontally on the positioning board according to detection information of laser sensors arranged around the positioning board.
Optionally, calculating the internal parameters and external parameters of the camera according to the position data of the image feature points in the calibration piece image and the position data of the calibration piece feature points of the calibration piece includes:
detecting whether the image feature points in the calibration piece image meet preset conditions;
in a case where the image feature points meet the preset conditions, calculating the internal parameters and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
Optionally, calculating the internal parameters and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration piece feature points includes:
calculating the internal parameters of the camera according to the coordinates of the image feature points of the N-1 calibration pieces other than the specific calibration piece in the calibration piece image and the coordinates of the calibration piece feature points of the N-1 calibration pieces;
calculating the external parameters of the camera, based on the calculated internal parameters, according to the coordinates of the image feature points of the specific calibration piece and the coordinates of the calibration piece feature points of the specific calibration piece, the specific calibration piece being the calibration piece directly facing the camera.
Optionally, the coordinates of the image feature points are sub-pixel-precision two-dimensional coordinates of the image feature points, and the sub-pixel-precision two-dimensional coordinates are calculated in the following manner:
performing iterative calculation on the original two-dimensional coordinates of the image feature points to obtain the sub-pixel-precision two-dimensional coordinates of the image feature points.
Optionally, calculating the internal parameters of the camera includes:
constructing a correspondence between two-dimensional coordinates and three-dimensional coordinates using multiple sets of corresponding two-dimensional coordinates of image feature points and three-dimensional coordinates of calibration piece feature points;
iteratively calculating the parameters of an objective function of the correspondence using the multiple sets of corresponding two-dimensional and three-dimensional coordinates until the reprojection error is less than a preset threshold, to obtain the internal parameters, the internal parameters including focal length parameters, aperture center parameters and distortion parameters.
Optionally, calculating the external parameters of the camera includes:
constructing, based on the calculated internal parameters, a correspondence between two-dimensional coordinates and three-dimensional coordinates according to multiple sets of corresponding two-dimensional coordinates of the image feature points of the specific calibration piece and three-dimensional coordinates of the calibration piece feature points;
iteratively calculating the parameters of an objective function of the correspondence using the multiple sets of corresponding two-dimensional and three-dimensional coordinates until the reprojection error is less than a preset threshold, to obtain the external parameters.
Optionally, the method further includes:
determining whether the calculated internal parameters and external parameters meet a preset standard;
if the preset standard is met, writing the internal parameters and external parameters into a memory;
if the preset standard is not met, providing corresponding prompt information.
Optionally, controlling the camera to acquire the calibration piece image includes:
providing specific lighting conditions for the camera using a preset lighting component;
wirelessly sending to the camera a control instruction for acquiring the calibration piece image, to control the camera to acquire the calibration piece image.
Optionally, the calibration piece includes a calibration board with a checkerboard, and the planes on which the checkerboards on the N calibration boards are located intersect in pairs; correspondingly, the image feature points are the checkerboard corner points of the calibration boards in the image, and the calibration piece feature points include the checkerboard corner points of the calibration boards.
An embodiment of this application further provides a camera parameter calibration device, the device including:
a calibration module, including N calibration pieces, N ≥ 3;
a camera positioning unit, configured to position the camera at a specific position so that the camera and the N calibration pieces satisfy a specific positional relationship;
a control unit, configured to control the camera to acquire calibration piece images, each calibration piece image containing images of the N calibration pieces;
a processing unit, configured to calculate the internal parameters and external parameters of the camera according to position data of image feature points in the calibration piece image and position data of calibration piece feature points of the calibration pieces.
Optionally, the camera positioning unit is configured to:
make all N calibration pieces located within the field of view of the camera, and make the distance between the optical center of the camera and a specific calibration piece among the N calibration pieces a predetermined distance, the specific calibration piece being the calibration piece directly facing the camera.
Optionally, the device is applied to an automatic mobile device that includes a camera; correspondingly, the camera positioning unit includes:
a positioning board, used for placing the automatic mobile device so that all N calibration pieces are within the field of view of the camera;
a detection sensor, used for providing detection information reflecting whether the automatic mobile device is placed horizontally.
Optionally, the device further includes:
a detection unit, used for detecting whether the image feature points in the calibration piece image meet preset conditions;
in a case where the detection unit detects that the image feature points meet the preset conditions, the processing unit calculates the internal parameters and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration piece feature points.
Optionally, the processing unit is configured to: calculate the internal parameters of the camera according to the coordinates of the image feature points of the N-1 calibration pieces other than the specific calibration piece in the calibration piece image and the coordinates of the calibration piece feature points of the N-1 calibration pieces; and calculate the external parameters of the camera, based on the calculated internal parameters, according to the coordinates of the image feature points of the specific calibration piece and the coordinates of the calibration piece feature points of the specific calibration piece; the specific calibration piece is the calibration piece directly facing the camera.
With the various implementations provided by this application, by setting the camera at a specific position so that the positional relationship between the camera and the multiple calibration pieces satisfies the specific conditions for calibration image acquisition, images of multiple calibration pieces can be acquired at one time, that is, one image includes images of multiple calibration pieces, which not only effectively improves calibration efficiency but also ensures stable image acquisition quality and enables batch calibration.
Further, a pinhole camera model is used to implement the 2D-3D coordinate conversion of the image feature points, i.e. three-dimensional spatial feature points are mapped onto the two-dimensional imaging plane, and the distance from the calibration piece feature points (checkerboard corner points) to the optical center of the camera obtained by the laser sensors provides high-precision focal length data for the pinhole camera model, so that internal and external parameter calibration can be achieved from a single acquired image containing multiple calibration pieces, further shortening the calibration time and improving calibration efficiency.
Brief Description of the Drawings
The features and advantages of this application will be understood more clearly with reference to the drawings, which are schematic and should not be construed as limiting this application in any way. In the drawings:
FIG. 1 shows a flowchart of the camera parameter calibration method in this application;
FIG. 2 shows a diagram of the pinhole camera model;
FIG. 3 shows a schematic diagram of the reprojection error;
FIG. 4 shows a structural diagram of the camera parameter calibration device in this application;
FIG. 5 shows a structural diagram of the camera parameter calibration equipment in this application with the housing removed;
FIG. 6 shows an external schematic diagram of the camera parameter calibration equipment in this application;
FIG. 7 shows a bottom structural diagram of the camera parameter calibration equipment in this application;
FIG. 8 shows a side view of the camera parameter calibration equipment in this application with the housing removed;
FIG. 9 shows a connection structural diagram of the calibration boards of the camera parameter calibration equipment in this application;
FIG. 10 shows a schematic diagram of the fourth support plate of the camera parameter calibration equipment in this application;
FIG. 11 shows the camera parameter calibration equipment in this application;
FIG. 12 shows a schematic diagram of the state in which the camera parameter calibration equipment in this application measures the distance to a sweeping robot when provided with a distance measuring component;
Reference numerals: 1 - semi-enclosed box; 2 - first calibration board; 3 - second calibration board; 4 - third calibration board; 5 - fourth calibration board; 6 - laser sensor; 7 - positioning slot; 8 - first transverse scale line; 9 - first longitudinal scale line; 10 - first support plate; 11 - first bracket; 12 - second support plate; 13 - second bracket; 14 - third support plate; 15 - first vertical connector; 16 - second vertical connector; 17 - rotatable connector; 18 - pin; 19 - first slot plate; 20 - fourth support plate; 21 - support column; 22 - second slot plate; 23 - second transverse scale line; 24 - second longitudinal scale line; 25 - light strip; 26 - first support block; 27 - second support block.
Detailed Description of the Embodiments
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the drawings. The same reference numerals in the drawings denote elements with the same or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless otherwise specified.
The dedicated word "exemplary" here means "serving as an example, embodiment or illustration". Any embodiment described here as "exemplary" is not necessarily to be construed as superior to or better than other embodiments.
In addition, numerous specific details are given in the following detailed description in order to better explain the present disclosure. Those skilled in the art should understand that the present disclosure can also be implemented without certain specific details. In some instances, methods, means, elements and circuits well known to those skilled in the art are not described in detail in order to highlight the gist of the present disclosure.
It should be understood that the term "and/or" herein merely describes an association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the three cases of A alone, both A and B, and B alone. In addition, the character "/" herein indicates an "or" relationship between the associated objects.
"Multiple" in the embodiments of the present disclosure means two or more. Descriptions such as "first" and "second" in the embodiments of the present disclosure are only used for illustration and for distinguishing the described objects; they imply no order, do not represent a particular limitation on the number in the embodiments of the present disclosure, and do not constitute any limitation on the embodiments of the present disclosure.
Fig. 1 is a schematic flowchart of a camera parameter calibration method provided by an embodiment of the present disclosure. The method can be applied to parameter calibration of various types of cameras; specifically, for example, it can be applied to parameter calibration of the camera of an automatic mobile device, and the automatic mobile device may be any electronic or smart device capable of moving automatically, such as a sweeping robot, a sweeping and mopping robot, an automatic lawn mower or a snow sweeper. Specifically, as shown in Fig. 1, the method includes:
Step S100: setting the camera at a specific position so that the camera and N calibration parts satisfy a specific positional relationship, N ≥ 3.
The calibration part may be a calibration component or assembly of any shape or type having calibration part feature points, and the calibration part feature points may be visual feature points arranged on the calibration part in an ordered or unordered manner. For example, in one embodiment of the present application, the calibration part may be a checkerboard calibration board; the checkerboard calibration board may be a square board whose calibration surface carries a checkerboard, and the corner points of the checkerboard can serve as the visual feature points of the calibration board, that is, as the calibration part feature points.
Of course, in other embodiments of the present application, the calibration part may also be another calibration part having visual feature points, such as a calibration block or a calibration sheet having visual feature points, which is not limited in the present application.
The number N of calibration parts may be any value greater than or equal to 3, for example 4 or 5. Generally, the more calibration parts there are, the higher the calibration accuracy of the camera parameters; therefore, in a specific implementation process, the implementer can determine the number of calibration parts according to actual conditions or the actually required calibration accuracy, which is not limited in the present application.
Step S200: controlling the camera to acquire calibration part images, each calibration part image containing images of the N calibration parts.
The calibration part image is an image obtained by one shot of the camera. Since the camera and the calibration parts satisfy the specific positional relationship, the calibration part image obtained by one shot of the camera can contain the images of the N calibration parts. For example, in one embodiment of the present application, the calibration parts may be calibration boards with checkerboards; correspondingly, the calibration part image acquired by one shot of the camera contains the images of the checkerboards on the N calibration boards.
Step S300: calculating the internal and external parameters of the camera according to the position data of the image feature points in the calibration part image and the position data of the calibration part feature points of the calibration parts.
In this embodiment, according to the checkerboard coordinates of the calibration parts in the image coordinate system of the calibration part image and the coordinates of the calibration parts in the world coordinate system, combined with the distance between the calibration parts and the camera and the distance from specified feature points of the calibration parts to the optical center of the camera, the 2D-3D coordinate conversion of the corner points is realized by using a pinhole camera model, which maps points in three-dimensional space onto the two-dimensional imaging plane. The internal and external parameters of the camera can be calculated from a single image, which simplifies the calibration process and improves calibration efficiency.
As an optional implementation, step S100 includes:
Step S110: making all the N calibration parts located within the field of view of the camera, and making the distance between the optical center of the camera and a specific calibration part among the N calibration parts a predetermined distance, the specific calibration part being the calibration part directly facing the camera.
In this embodiment, the calibration part image contains four calibration boards; it is ensured that all the checkerboards used for calibration are within the field of view of the camera and that every cell of every calibration board lies within the generated image area to be calibrated. The distance between the optical center of the camera and the center of the calibration board directly facing the camera is fixed: for example, in the calibration part image captured by the camera, the border of the checkerboard calibration board coincides with the border of the calibration part image, or the spacing between the border of the checkerboard calibration board and the border of the calibration part image is less than 2 mm.
As an optional implementation, the method is applied to an automatic mobile device including the camera; correspondingly, step S110 includes:
Step S111: placing the automatic mobile device on a predetermined positioning plate.
In this embodiment, the automatic mobile device is a sweeping robot, and the camera is located on the top of the sweeping robot. The sweeping robot is placed on the positioning plate so that the camera on top of the sweeping robot can acquire the calibration part image.
In a specific embodiment, a groove is provided in the positioning plate to fix the driving wheels of the sweeping robot.
Step S112: adjusting the automatic mobile device so that it is placed horizontally on the positioning plate, according to detection information of laser sensors arranged around the positioning plate.
In this embodiment, one laser sensor is arranged directly above each of two corner points of the positioning plate, and the distance to the optical center of the camera on the automatic mobile device is measured at the same time. When the distances measured by the four laser sensors are consistent with the preset distances, the automatic mobile device is considered to be in position and the calibration of the camera parameters can be performed.
Meanwhile, the distances from the corner points of the multiple calibration parts to the optical center of the camera are obtained through the laser sensors, providing the pinhole camera model with a high-accuracy focal length datum, so that the internal and external parameters can be calibrated from a single image. Moreover, calculations such as coordinate conversion during the calibration process take little time, which greatly reduces the requirements on operators, saves time, and improves the robustness and accuracy of the calibration.
As an optional implementation, step S300 includes:
Step S310: detecting whether the image feature points in the calibration part image meet a preset condition.
In this embodiment, quality inspection is performed on the calibration part image:
This state is mainly used to determine whether the acquired image meets the calibration requirements. If the acquired image itself does not allow effective segmentation of the calibration boards required for the calibration parameters and detection of their corner points, a prompt indicating that calibration board detection has failed is displayed on the terminal interface. The causes of failure include, but are not limited to, the light being too dark and the camera's view of the checkerboards being partially occluded; in the program these all ultimately manifest as an inability to effectively detect checkerboard inner corner points that meet the standard. For example, a relatively mature detection method in OpenCV is used, and whether corner detection succeeds is determined by the return value of the function. If the detection failure is caused by the light being too dark, this can also be judged in advance in another way: before the checkerboards are segmented, the brightness of the acquired image is first detected. Since, in the design of the fixture, the whole scene is illuminated by light strips, sufficient light in the area to be imaged is guaranteed regardless of how the external environment changes; therefore, if the overall brightness of the image is lower than a set threshold, calibration board detection is judged to have failed.
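For illustration only, such a quality check may be sketched as follows, assuming OpenCV in Python; the pattern size, brightness threshold and file name are assumed values and do not limit the present application:

```python
import cv2
import numpy as np

PATTERN_SIZE = (7, 7)          # inner corners per board (assumed)
BRIGHTNESS_THRESHOLD = 60.0    # minimum mean grey level (assumed)

def check_calibration_image(path):
    """Return detected corners, or raise with a prompt analogous to the
    'calibration board detection failed' message described above."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise RuntimeError("image could not be read")
    # brightness check performed before any checkerboard segmentation
    if float(np.mean(gray)) < BRIGHTNESS_THRESHOLD:
        raise RuntimeError("calibration board detection failed: image too dark")
    # OpenCV checkerboard detection; the boolean return value decides success
    found, corners = cv2.findChessboardCorners(
        gray, PATTERN_SIZE,
        flags=cv2.CALIB_CB_ADAPTIVE_THRESH + cv2.CALIB_CB_NORMALIZE_IMAGE)
    if not found:
        raise RuntimeError("calibration board detection failed: corners not found")
    return corners
```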
The posture of the camera is also checked. This fault state mainly arises from problems in the step where the operator places the machine on the positioning plate, for example placing the machine carrying the camera to be calibrated on the positioning plate at random without seating it according to the arrangement of the positioning plate. This causes the machine to have a relatively large deviation in pitch angle and/or roll angle and/or yaw angle. If the deviation is caused by the installation of the machine itself, the angle can be calibrated out through the external parameters; but if the error is caused by an operator's mistake, it accumulates and cannot be eliminated, which is a large potential risk. To address this problem, the program judges the above angles and constrains them with set thresholds. If an angle exceeds the specified threshold, a fault indicating an abnormal machine posture is displayed on the terminal interface, which reminds the operator to check the placement of the machine and effectively prevents this situation from occurring.
In addition, in order to eliminate this situation completely, two laser sensors are placed at diagonal positions of the positioning plate in the fixture design, and further confirmation is made through the information fed back by the laser sensors to ensure that the machine itself is placed horizontally on the positioning plate. Only after both checks are passed does the process enter the next calibration step.
Step S320: in the case that the image feature points meet the preset condition, calculating the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration part feature points.
In this embodiment, by performing quality inspection on the calibration part image in advance and performing parameter calculation only on qualified calibration part images, computing resources are saved and the accuracy of camera parameter calibration is improved.
As an optional implementation, step S320 includes:
Step S321: calculating the internal parameters of the camera according to the coordinates of the image feature points of the N-1 calibration parts other than the specific calibration part in the calibration part image and the coordinates of the calibration part feature points of the N-1 calibration parts.
In this embodiment, four calibration parts are provided; the specific calibration part is the calibration part parallel to the camera, and the other three calibration parts are arranged obliquely with respect to the camera. In the intrinsic calibration step, the variables to be calibrated include the focal length (fx, fy), the aperture center (cx, cy) and the distortion parameters (k1, k2, p1, p2). The 2D coordinates of the detected checkerboard corner points of the three calibration parts required for intrinsic calibration are obtained respectively. Since the checkerboard template is known, the 3D coordinates corresponding to each corner point can be obtained conveniently, with the z axis set to 0. Given multiple pairs of corresponding 3D-2D points, the pose of the camera can be estimated by the PnP method, so the initial relative pose of each segmented checkerboard can be obtained and used to construct the correspondences of multiple groups of inner corner points in the multiple checkerboards, and the solution is then obtained by constructing a least-squares problem and performing iterative optimization.
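As an illustrative sketch only, and not the exact optimisation described in this application, the per-board PnP initialisation and the joint least-squares refinement can be approximated with OpenCV's built-in calibration routine; the board size, square size and the use of cv2.calibrateCamera as the optimiser are assumptions:

```python
import cv2
import numpy as np

PATTERN_SIZE = (7, 7)   # inner corners per checkerboard (assumed)
SQUARE_SIZE = 0.02      # checkerboard square size in metres (assumed)

# template 3D points of one checkerboard, z = 0
template = np.zeros((PATTERN_SIZE[0] * PATTERN_SIZE[1], 3), np.float32)
template[:, :2] = np.mgrid[0:PATTERN_SIZE[0], 0:PATTERN_SIZE[1]].T.reshape(-1, 2)
template *= SQUARE_SIZE

def calibrate_intrinsics(corner_sets, image_size):
    """corner_sets: list of 2D corner arrays, one per tilted board (N-1 boards).
    Each board is treated as one planar view of the known template."""
    object_points = [template] * len(corner_sets)
    image_points = [c.astype(np.float32) for c in corner_sets]
    # initial per-board poses come from PnP inside calibrateCamera; the routine
    # then refines fx, fy, cx, cy, k1, k2, p1, p2 by minimising the
    # reprojection error, analogous to the least-squares problem above
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, image_points, image_size, None, None,
        flags=cv2.CALIB_FIX_K3)
    return rms, K, dist
```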
Step S322: based on the calculated internal parameters, calculating the external parameters of the camera according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part, the specific calibration part being the calibration part directly facing the camera.
In this embodiment, the external parameters of the camera are calculated through the pinhole camera model according to the internal parameters calculated in step S321, the coordinates of the image feature points of the specific calibration part, and the coordinates of the calibration part feature points of the specific calibration part.
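A minimal sketch of this extrinsic step, assuming the intrinsics K and distortion dist from the previous step and using OpenCV's PnP solver as a stand-in for the optimisation described further below:

```python
import cv2
import numpy as np

def calibrate_extrinsics(obj_points, img_points, K, dist):
    """obj_points: 3D corner coordinates of the specific (facing) board in the
    world frame; img_points: their 2D coordinates in the calibration image."""
    ok, rvec, tvec = cv2.solvePnP(
        obj_points.astype(np.float32), img_points.astype(np.float32), K, dist)
    if not ok:
        raise RuntimeError("extrinsic estimation failed")
    R, _ = cv2.Rodrigues(rvec)   # rotation R and translation t form (R, t)
    return R, tvec
```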
As an optional implementation, the coordinates of the image feature points are sub-pixel precision two-dimensional coordinates of the image feature points, and the sub-pixel precision two-dimensional coordinates are calculated in the following manner:
performing iterative calculation on the original two-dimensional coordinates of the image feature points to obtain the sub-pixel precision two-dimensional coordinates of the image feature points.
In this embodiment, feature detection and position calculation are performed on each segmented checkerboard. The features here are the detected corner points. First, inner corner points are detected in each segmented sub-image; if corner points matching the set template number are all detected and arranged in a certain order, the coordinates of all inner corner points can be obtained, and the inner corner points can be numbered sequentially by row or column. The coordinates obtained at this stage are pixel-level coordinates. On the basis of the original initial corner coordinates, corner coordinates with sub-pixel precision can be obtained through an iterative solving method, so that the parameter calibration results are more accurate.
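An illustrative sketch of this refinement using OpenCV's iterative sub-pixel corner routine; the window size and termination criteria are assumed values:

```python
import cv2

def refine_corners(gray, corners):
    """Iteratively refine pixel-level corner coordinates to sub-pixel precision."""
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 0.001)
    return cv2.cornerSubPix(gray, corners, winSize=(5, 5),
                            zeroZone=(-1, -1), criteria=criteria)
```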
As an optional implementation, calculating the internal parameters of the camera in step S321 includes:
constructing a correspondence between two-dimensional coordinates and three-dimensional coordinates by using multiple groups of corresponding two-dimensional coordinates of the image feature points and three-dimensional coordinates of the calibration part feature points;
using the multiple groups of corresponding two-dimensional and three-dimensional coordinates, iteratively calculating the parameters of the objective function of the correspondence until the reprojection error is smaller than a preset threshold, so as to obtain the internal parameters, the internal parameters including a focal length parameter, an aperture center parameter and distortion parameters.
Before the internal parameters are calculated, the image is preprocessed. Through experimental comparison of filtering algorithms such as Gaussian filtering, median filtering and mean filtering, Gaussian filtering gives the best preprocessing effect, so the acquired image is preprocessed by Gaussian filtering. This linear filtering method takes a weighted average over the whole image: the value of each pixel is obtained by a weighted average of itself and the other pixel values in its neighborhood, which can effectively suppress noise obeying a normal distribution. Gaussian filtering essentially applies a two-dimensional normal distribution to a two-dimensional matrix, and the value of G(x, y) represents the weight at a position in the matrix. The Gaussian filtering result is then normalized, so that the image is smoothed.
The Gaussian filter is calculated as follows:
G(x, y) = (1 / (2πs²)) · exp(−(x² + y²) / (2s²))
where x and y respectively denote the horizontal and vertical coordinates of a pixel in the image to be calibrated, and s denotes the variance of the Gaussian model.
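A brief sketch of this preprocessing step with OpenCV, for illustration only; the kernel size and sigma are assumed values:

```python
import cv2

def preprocess(gray):
    """Gaussian smoothing of the image to be calibrated before corner detection."""
    blurred = cv2.GaussianBlur(gray, ksize=(5, 5), sigmaX=1.0)
    # normalize the smoothed result to the full grey range
    return cv2.normalize(blurred, None, 0, 255, cv2.NORM_MINMAX)
```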
Laser sensors are also arranged on the respective calibration boards to obtain the distance from the optical center of the camera to each corner point of the top-view checkerboard, which is a prerequisite for the high accuracy of the subsequent parameter calibration.
In this embodiment, inner corner points are detected in each segmented sub-image; if corner points matching the set template number are all detected and arranged in a certain order, the coordinates of all the inner corner points can be obtained.
The grayscale distribution of the pixels of the image to be calibrated is described by a Gaussian model:
M(x, q) = A + B · exp(−((x₁ − u)² + (x₂ − v)²) / (2σ²))           (1)
The intensities of the pixels in the image to be calibrated are optimized and solved:
E(q) = Σ (M(x, q) − I(x))²           (2)
According to the intensities of the pixels in the image to be calibrated, the world coordinates P_c = [X, Y, Z] corresponding to the inner corner points in the image to be calibrated and the camera coordinates P′ = [X′, Y′, Z′] corresponding to the inner corner points are obtained.
Here, M represents the intensity at the corresponding pixel position, A represents the intensity of the background, B represents the peak intensity in the bright region, (u, v) represents the position of the peak of the bright region, and σ denotes the width of the Gaussian model; x = (x₁, x₂) is the pixel position, M(x, q) represents the measured intensity of the pixel, and I(x) represents the pixel intensity estimated by the Gaussian model. In the camera coordinate system, the X axis points to the right, the Y axis points downward, and the Z axis points to the front of the camera.
By performing intensity analysis on the pixels in the image, the true positions of the corner points in the image to be calibrated can be detected.
It is assumed that M(x, q) = I(x), that is, the measured pixel intensity and the intensity estimated by the model are equal. Due to various factors, however, the two cannot be completely equal, so the optimal solution is found by the least-squares method, that is, formula (2), and formula (2) is iteratively optimized to obtain a global or local optimal solution. The termination conditions of the iteration are a limit on the number of iterations and a limited precision range; for example, the iteration is stopped when the number of iterations reaches 50, or when the results of several consecutive iterations vary within 0.2%. The number of iterations and the precision range are set according to actual needs.
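By way of illustration only, the least-squares fit of formula (2) with such stopping conditions could be sketched as follows, assuming SciPy; the model form, initial values and tolerances are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def spot_model(q, xy):
    """Gaussian intensity model M(x, q) with q = (A, B, u, v, sigma)."""
    A, B, u, v, sigma = q
    d2 = (xy[:, 0] - u) ** 2 + (xy[:, 1] - v) ** 2
    return A + B * np.exp(-d2 / (2.0 * sigma ** 2))

def refine_corner_intensity(patch):
    """Fit the model to a small image patch around a detected corner and
    return the refined peak position (u, v)."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xy = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    I = patch.ravel().astype(float)
    q0 = np.array([I.min(), I.max() - I.min(), w / 2.0, h / 2.0, 2.0])
    res = least_squares(lambda q: spot_model(q, xy) - I, q0,
                        max_nfev=50, xtol=2e-3)   # iteration / precision limits
    return res.x[2], res.x[3]
```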
The focal length (fx, fy), the aperture center (cx, cy) and the distortion parameters (k1, k2, p1, p2) are calibrated. As shown in Fig. 2, since there are four calibration board planes and three of them are not parallel to the camera lens, the corresponding focal length has components along the X axis or the Y axis. Both the focal length and the aperture center are information that can be obtained directly through the laser sensors.
Through the foregoing implementations, the 2D coordinates of the detected corner points of the three checkerboards required for intrinsic calibration can be obtained respectively. Since the acquired image itself has not been undistorted, the obtained coordinates also carry distortion; the acquired image can be undistorted according to the initial calibration parameters, and the undistorted coordinates of the inner corner points are then obtained.
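A short illustrative sketch of this undistortion of corner coordinates with OpenCV, assuming initial parameters K0 and dist0:

```python
import cv2
import numpy as np

def undistort_corners(corners, K0, dist0):
    """Map distorted corner coordinates to undistorted pixel coordinates
    using the initial calibration parameters."""
    pts = corners.reshape(-1, 1, 2).astype(np.float32)
    # P=K0 keeps the result in pixel coordinates rather than normalized ones
    return cv2.undistortPoints(pts, K0, dist0, P=K0).reshape(-1, 2)
```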
As an optional implementation, calculating the external parameters of the camera in step S322 includes:
based on the calculated internal parameters, constructing a correspondence between two-dimensional coordinates and three-dimensional coordinates according to multiple groups of corresponding two-dimensional coordinates of the image feature points of the specific calibration part and three-dimensional coordinates of the calibration part feature points;
using the multiple groups of corresponding two-dimensional and three-dimensional coordinates, iteratively calculating the parameters of the objective function of the correspondence until the reprojection error is smaller than a preset threshold, so as to obtain the external parameters.
In this example, the coordinates P_c corresponding to the inner corner points are projected onto P′ through the pinhole camera model:
Z / f = X / X′ = Y / Y′           (3)
that is,
X′ = f · X / Z,   Y′ = f · Y / Z           (4)
Several coordinate systems are involved in the calibration of the internal parameters, including the world coordinate system, the camera coordinate system and the image coordinate system. In this embodiment, the camera is abstracted as a pinhole camera model for explanation. The imaging principle of a single camera can be approximated by the pinhole camera model, which is simple and effective and maps points in three-dimensional space onto the two-dimensional imaging plane. In the camera coordinate system, x points to the right, y points downward and z points to the front of the camera (following the common abstract model, regardless of the axis order of the actual camera coordinate system). In the real world, a point P_c = [X, Y, Z] in the O-x-y-z camera coordinate system can be projected onto the two-dimensional imaging plane O′-x′-y′ through the pinhole camera model, the projected point being P′ = [X′, Y′, Z′]; formulas (3) and (4) are obtained by the principle of similar triangles.
The camera coordinates P′ are converted to pixel coordinates [u, v]ᵀ:
u = m · X′ + cx,   v = n · Y′ + cy           (5)
Substituting formula (4) gives
u = fx · X / Z + cx,   v = fy · Y / Z + cy           (6)
The pixel coordinates are converted to homogeneous coordinates:
Z · [u, v, 1]ᵀ = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]] · [X, Y, Z]ᵀ = K · P_c           (7)
The pixel coordinates are converted to the world coordinate system:
Z · [u, v, 1]ᵀ = K · (R · P_w + t) = K · T · P_w           (8)
where f is the focal length; m is the scaling factor of the inner corner point on the u axis after conversion to the pixel coordinate system and n is the scaling factor on the v axis; [cx, cy]ᵀ is the translation of the origin; fx = m·f, fy = n·f; the intermediate matrix K is the intrinsic matrix, and its parameters are calibrated by Zhang Zhengyou's calibration method; P_c = T·P_w, where T is the transformation matrix, R is the rotation parameter and t is the translation parameter.
In addition to the internal parameters, in general the coordinates of a three-dimensional space point in the camera coordinate system are not easy to know, while the coordinates of the space point in the world coordinate system are easy to obtain. The camera coordinate system and the world coordinate system can be converted through the transformation matrix, expressed as P_c = T·P_w; substituting this into formula (7) yields formula (8). (R, t) are called the external parameters of the camera, representing rotation and translation respectively, and these two quantities are the targets to be estimated in SLAM. Projecting P onto the normalized plane Z = 1 gives the normalized camera coordinates P_c′ = [X/Z, Y/Z, 1]ᵀ, and the pixel coordinates are obtained from the normalized coordinates after applying the internal parameters.
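A small numerical sketch of this projection chain (formulas (3) to (8)), assuming NumPy; K, R, t and the world point are made-up values for illustration only:

```python
import numpy as np

def project(P_w, K, R, t):
    """Project a world point P_w to pixel coordinates via P_c = R P_w + t
    and the intrinsic matrix K, as in formulas (7) and (8)."""
    P_c = R @ P_w + t                  # world -> camera coordinates
    uv1 = K @ P_c / P_c[2]             # normalize by Z and apply intrinsics
    return uv1[:2]

# illustrative values only
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 0.5])
print(project(np.array([0.1, 0.05, 1.0]), K, R, t))
```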
However, the pinhole model is an ideal model. In practical applications, the tangential and radial distortion caused by the camera lens change the position at which a light ray passing through the lens is projected onto the imaging surface; these are the distortion parameters that need to be calibrated as mentioned above. In the extrinsic calibration step, the coordinates of the 3D and 2D points are obtained in a similar way, and the solution is likewise obtained by constructing and optimizing a least-squares problem. In addition, the coordinates of the optical center of the camera on the x, y and z axes need to be calculated separately; these can be calculated from the standard dimensions of the fixture. It should be noted that the accuracy of the external parameters depends to a great extent on the high accuracy of the fixture itself, that is, on the distance from the calibration board to the optical center, for which high-accuracy measurement data are obtained through the laser sensors.
Multiple groups of 3D-2D point pair relationships have been obtained in the foregoing implementations; in this step an optimization problem is constructed and the parameters are solved by minimizing the projection error. For the intrinsic calibration, there are eight variables to be optimized, namely the intrinsic variables: the focal length parameters (fx, fy), the aperture center parameters (cx, cy) and the distortion parameter variables (k1, k2, p1, p2). Through iterative solving, the optimized variable values are finally obtained, and the corresponding maximum projection error and average projection error are calculated at the same time to measure the accuracy of the result. The solution of the external parameters involves solving the rotation matrix and the translation matrix; on the basis of the already solved internal parameters, an optimization problem can likewise be constructed to minimize the reprojection error and iteratively solve the extrinsic variables.
At present, the mainstream method of nonlinear optimization is graph optimization. Before describing graph optimization, the three elements of an optimization method are analyzed: the objective function, the optimization variables and the constraints. An optimization problem can be described as: given an objective function, the variables to be optimized are optimized under the corresponding constraints, and the optimal solution is finally obtained. Graph optimization expresses the optimization problem in the form of a graph; what is constructed here is the 3D-to-2D correspondence, and the solution is obtained iteratively by minimizing the reprojection error. Its objective function is as follows:
F(x) = Σᵢ e(xᵢ, zᵢ)ᵀ · Ωᵢ · e(xᵢ, zᵢ)
where the function e represents the error given by the difference between the observation zᵢ and the observation equation, and the information matrix Ω is the inverse of the covariance matrix. This expression is the optimization problem composed of the multiple nodes and edges in Fig. 3. To solve this optimization problem, an initial value and an iteration direction are needed; in each iteration step an increment Δx_k is sought, and the Jacobian and Hessian matrices of the current estimate are calculated. Depending on the gradient descent strategy selected (that is, how the increment Δx_k is determined), the more representative algorithms are G-N (Gauss-Newton) and L-M (Levenberg-Marquardt). Both solve the incremental equation; if Δx_k becomes sufficiently small during the solving process, the iteration stops, otherwise it continues.
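The following sketch shows, for illustration only, how such a reprojection-error objective could be minimised with a Levenberg-Marquardt solver; it assumes SciPy and OpenCV's projectPoints, and the (rvec, tvec) parameterisation is an assumption rather than the graph formulation used in this application:

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def reprojection_residual(params, obj_pts, img_pts, K, dist):
    """Residual e = projected point - observed point, stacked for all corners."""
    rvec, tvec = params[:3], params[3:6]
    proj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
    return (proj.reshape(-1, 2) - img_pts).ravel()

def refine_pose(obj_pts, img_pts, K, dist, rvec0, tvec0):
    x0 = np.hstack([rvec0.ravel(), tvec0.ravel()])
    res = least_squares(reprojection_residual, x0, method="lm",
                        args=(obj_pts.astype(np.float32), img_pts, K, dist))
    return res.x[:3], res.x[3:6]
```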
As an optional implementation, the method further includes:
determining whether the calculated internal and external parameters meet a preset standard;
if they meet the preset standard, writing the internal and external parameters into a memory;
if they do not meet the preset standard, providing corresponding prompt information.
In this embodiment, automation needs to be considered in the design of the program so as to reduce the risk of human involvement as much as possible. On this basis, the program is designed to judge the quality of the calibration result autonomously without human judgment: the result is judged through the reprojection error, including both the maximum projection error and the average projection error, to ensure that the final result is reliable. If the obtained projection error exceeds the set threshold, the fault detection displays a prompt on the terminal interface indicating that the result is unreasonable; after seeing this prompt, the operator performs some checks on the machine and then tries to perform the calibration again.
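A minimal sketch of this automatic check, assuming per-corner reprojection errors have been computed as above; the threshold values are assumed and do not limit the present application:

```python
import numpy as np

MAX_ERR_THRESHOLD = 1.0    # pixels, assumed
MEAN_ERR_THRESHOLD = 0.5   # pixels, assumed

def validate_result(errors):
    """errors: per-corner reprojection errors in pixels."""
    max_err = float(np.max(errors))
    mean_err = float(np.mean(errors))
    if max_err > MAX_ERR_THRESHOLD or mean_err > MEAN_ERR_THRESHOLD:
        return False, "calibration result unreasonable, please check the machine"
    return True, "calibration result accepted"
```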
In a specific embodiment, the task thread is also checked. This fault state is mainly intended to prevent the operator from issuing the calibration instruction multiple times. Since the calibration program is only a part of the whole machine operating program, the time consumed by this part needs to be effectively controlled, so an asynchronous scheme is adopted in the program design: the calibration program itself actively creates an asynchronous thread for calculation and does not block the running of the main thread. Before the started calibration sub-thread has finished running, no further response is made even if a calibration instruction is received again, and the fault detection displays "task busy" on the terminal interface. This tells the operator that the operation does not need to be repeated; after the result is returned, the next operation is performed according to whether the calibration succeeded.
The file writing state is also checked. This state is mainly used to determine that the result has been effectively written into the memory of the machine. After the calibration is finished, the generated calibration result is written into the specified calibration file. To ensure that the calibration result has been completely written into the calibration file, a file read operation is performed again after writing; if both reading and writing are fine, the calibration is considered successful. If the calibration result has not been completely written into the calibration file, the fault detection displays "file write failed" on the terminal interface, and the operator performs the calibration again according to the prompt.
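By way of illustration, such a write-and-verify step might look as follows; the file format (JSON) and path are assumptions:

```python
import json

def write_calibration(path, intrinsics, extrinsics):
    """Write the calibration result and verify it by reading it back."""
    result = {"intrinsics": intrinsics, "extrinsics": extrinsics}
    with open(path, "w") as f:
        json.dump(result, f)
    try:
        with open(path) as f:
            return json.load(f) == result   # read-back check
    except (OSError, ValueError):
        return False
```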
As an optional implementation, controlling the camera to acquire the calibration part image includes:
using a preset lighting component to provide specific lighting conditions for the camera;
wirelessly sending a control instruction for acquiring the calibration part image to the camera, so as to control the camera to acquire the calibration part image.
In this embodiment, the calibration boards are fixedly installed in a semi-enclosed box, and a light source is arranged inside the box to provide lighting compensation for the calibration boards when the camera takes images. This eliminates interference from external light and at the same time allows the apparatus to work around the clock. In a specific embodiment, two groups of light strips are installed inside the semi-enclosed fixture; regardless of the external light, the apparatus can provide suitable light so as to obtain reliable images to be calibrated.
Considering that the box is semi-enclosed, the camera is controlled to photograph the calibration parts by wireless means such as Bluetooth, which is convenient for the operator.
As an optional implementation, the calibration parts include calibration boards with checkerboards, and the planes on which the checkerboards of the N calibration boards lie intersect each other in pairs; correspondingly, the image feature points are checkerboard corner points of the calibration boards in the image, and the calibration part feature points include the checkerboard corner points of the calibration boards.
In this embodiment, the calibration apparatus has four checkerboard calibration boards, one of which faces horizontally downward while the other three are each arranged obliquely. The planes of the four calibration boards intersect each other in pairs and are not parallel to each other, the checkerboard faces of the four calibration boards all face the same direction, and the four calibration boards are fixed by screws and brackets. In a specific embodiment, at least one laser sensor is installed on each of the three calibration boards other than the horizontally downward-facing one. The laser sensors detect whether the machine with the camera on its top, for example a sweeping robot whose top surface is horizontal and whose camera lens to be calibrated is also horizontal, is placed horizontally on the positioning plate. At the same time, the laser sensors obtain the distance from the optical center of the camera to each corner point of the top-view checkerboard, which is a prerequisite for the high accuracy of the subsequent parameter calibration.
Since the specific calibration part is used for the calibration of the external parameters, it is necessary to ensure that the front of the camera faces its calibration surface, so that the distances from the feature points of the calibration surface to the optical center of the camera are accurate. For an automatic mobile device, it is therefore necessary to ensure that the camera mounting surface is parallel to the specific calibration part. On the one hand, a positioning plate parallel to the specific calibration part can be used to preliminarily ensure that the camera mounting surface is parallel to the calibration surface of the specific calibration part; on the other hand, corresponding detection means also need to be adopted to determine whether the camera mounting surface is indeed parallel to the calibration surface of the specific calibration part.
The laser sensors detect whether the machine with the camera on its top is placed horizontally on the positioning plate, and at the same time obtain the distance from the optical center of the camera to each corner point of the top-view checkerboard, which is a prerequisite for the high accuracy of the subsequent parameter calibration. The pinhole camera model is used to realize the 2D-3D coordinate conversion of the corner points, mapping points in three-dimensional space onto the two-dimensional imaging plane. The distance from the corner points to the optical center obtained by the laser sensors provides the pinhole camera model with a high-accuracy focal length datum, so that the internal and external parameters can be calibrated from a single image. Moreover, calculations such as coordinate conversion during the calibration process take little time, which greatly reduces the requirements on operators, saves time, and improves the robustness and accuracy of the calibration.
Based on the methods provided by the embodiments corresponding to Figs. 1 to 3 above, the present application further provides a camera parameter calibration apparatus. The apparatus can be used to perform the above method and can be applied to camera calibration of an automatic mobile device. Fig. 4 is a schematic module structure diagram of an embodiment of a camera parameter calibration apparatus provided by the present disclosure. Specifically, as shown in Fig. 4, the apparatus may include:
a calibration module 401 including N calibration parts, N ≥ 3;
a camera positioning unit 402 for positioning a camera 403 at a specific position so that the camera 403 and the N calibration parts satisfy a specific positional relationship;
a control unit 404 for controlling the camera 403 to acquire calibration part images, each calibration part image containing images of the N calibration parts;
a processing unit 405 configured to calculate the internal and external parameters of the camera according to the position data of the image feature points in the calibration part image and the position data of the calibration part feature points of the calibration parts.
In this embodiment, as shown in Figs. 5 to 11, the calibration module further includes a semi-enclosed box 1. For example, the calibration parts are calibration boards, and for example there are four calibration boards, that is, the above N is 4; the four calibration boards are respectively a first calibration board 2, a second calibration board 3, a third calibration board 4 and a fourth calibration board 5. The calibration module further includes at least four laser sensors 6 and a calibration processing module, wherein: an opening is provided at the bottom of one of the wider side faces of the semi-enclosed box 1; the first calibration board 2 is horizontally fixed on the top of the semi-enclosed box 1; the second calibration board 3 and the third calibration board 4 are respectively fixed on the two sides adjacent to the side face of the semi-enclosed box 1 provided with the opening; the fourth calibration board 5 is fixed on the side opposite to the side face of the semi-enclosed box 1 provided with the opening; the at least four laser sensors 6 are respectively arranged at the bottom-end vertices of the first calibration board 2, the second calibration board 3, the third calibration board 4 and the fourth calibration board 5; the calibration processing module is connected with the at least four laser sensors 6 through communication; and the planes of the first calibration board 2, the second calibration board 3, the third calibration board 4 and the fourth calibration board 5 intersect each other in pairs and are not parallel to each other.
Preferably, the first calibration board 2 is horizontally installed on the top of the semi-enclosed box 1, with its calibration surface located directly above the positioning plate and parallel to the positioning plate; the second calibration board 3, the third calibration board 4 and the fourth calibration board 5 are respectively installed on the side faces of the semi-enclosed box 1, with their calibration surfaces all inclined toward the positioning plate.
The calibration processing module is arranged on the machine where the camera to be calibrated is located, and is connected to the at least four laser sensors through a wireless network or Bluetooth. The calibration processing module processes and calculates the image from the camera to be calibrated, and calibrates the parameters of the camera in combination with the distance information, received from the laser sensors, from each calibration board to the optical center of the camera to be calibrated. The calibration processing module is also controlled by an external Bluetooth signal to start the calibration thread; the external Bluetooth signal can be generated by a Bluetooth switch and sent to the calibration processing module.
As shown in Fig. 7, the calibration module further includes a positioning groove 7 fixed above the bottom plate on the side of the semi-enclosed box 1 provided with the opening, the shape of the positioning groove 7 being consistent with the contour of the bottom of the machine. A first transverse scale line 8 and a first longitudinal scale line 9 are arranged below the positioning groove 7; the first transverse scale line 8 and the first longitudinal scale line 9 are perpendicular to each other, and their intersection passes through the center of the positioning groove 7.
As shown in Fig. 8, the calibration module further includes a first support plate 10 horizontally fixed in the semi-enclosed box 1 through a first bracket 11; a second support plate 12 horizontally fixed in the semi-enclosed box 1 through a second bracket 13, the first support plate 10 and the second support plate 12 being located at the same height and being parallel to one side of the semi-enclosed box 1; and a third support plate 14 fixed on one edge side of the first support plate 10 and the second support plate 12 through a first support block 26 and a second support block 27. The second calibration board 3, the third calibration board 4 and the fourth calibration board 5 are respectively fixed obliquely on the first support plate 10, the second support plate 12 and the third support plate 14. The third support plate 14 is fixed on one edge side of the first support plate 10 and the second support plate 12 through screws and the first support block 26 and the second support block 27. The support plates are provided with slots, and the calibration boards can be fixed on the support plates directly through the slots.
As shown in Fig. 9, the calibration module further includes: a first vertical connector 15 fixed in parallel below the first support plate 10 through at least two screws; a second vertical connector 16 horizontally fixed, through at least two screws, on the side of the first vertical connector 15 away from the inner wall of the semi-enclosed box 1; a rotatable connector 17 connected to the second vertical connector 16 through a pin shaft 18; and a first slot plate 19, the back of which is fixed to the rotatable connector 17, for placing the second calibration board 3. A T-shaped connector is adhered to the back of the first slot plate 19 and connected to the second vertical connector 16. The pin shaft 18 can slide in the slide rail of the rotatable connector 17, thereby changing the inclination angle of the second calibration board 3. In a specific embodiment, the other two calibration boards are also fixed on the corresponding support plates through the above structure.
As shown in Figs. 5 and 10, the calibration module further includes: a fourth support plate 20 fixed above the first support plate 10 and the second support plate 12 through four support columns 21, the fourth support plate 20 being higher than the third support plate; and a second slot plate 22, the back of which is connected to the fourth support plate 20 through fasteners, the second slot plate 22 being located below the fourth support plate 20 and used for placing the first calibration board 2. A second transverse scale line 23 and a second longitudinal scale line 24 are arranged on the side of the fourth support plate 20 facing the second slot plate 22; their intersection passes through the center of the fourth support plate 20. As shown in Figs. 10 and 11, the second transverse scale line 23 and the second longitudinal scale line 24 extend beyond the vertical projection of the first calibration board onto the fourth support plate, which is convenient for installation by workers.
As shown in Fig. 5, the calibration module further includes a light strip 25 arranged at the bottom of the semi-enclosed box 1 on either side adjacent to the opening or on the side opposite to the opening. By providing the light strip, the image is supplemented with light during the calibration process, improving the pass rate of the calibration images.
For example, the box opening of the box 1 faces the positioning plate, and the calibration parts are installed inside the semi-enclosed box 1. The semi-enclosed box reserves a window for placing and removing the automatic mobile device and at the same time creates a dim environment for the calibration module; combined with the light strip 25 arranged inside the semi-enclosed box, it provides illumination during the calibration process.
As an optional implementation, the camera positioning unit is configured to:
make all the N calibration parts located within the field of view of the camera, and make the distance between the optical center of the camera and a specific calibration part among the N calibration parts a predetermined distance, the specific calibration part being the calibration part directly facing the camera.
As an optional implementation, the apparatus is applied to an automatic mobile device including the camera; correspondingly, the camera positioning unit includes:
a positioning plate for placing the automatic mobile device so that all the N calibration parts are located within the field of view of the camera;
a detection sensor for providing detection information reflecting whether the automatic mobile device is placed horizontally.
Preferably, the above positioning groove is provided on the positioning plate to restrict the movement of the automatic mobile device and ensure that the position of the automatic mobile device remains unchanged during the calibration process.
Preferably, the positioning plate is arranged below the calibration module for placing the automatic mobile device, and the positioning plate and the N calibration parts satisfy a specific positional relationship. Of course, the positioning plate may also be arranged above the calibration module, as long as the two are arranged opposite each other.
For clarity of description, the following takes the case where the positioning plate is arranged below the calibration module as an example to explain the specific structure. In this example, a specific calibration part 2 among the N calibration parts is located directly above the positioning plate; correspondingly, the positioning plate and the N calibration parts satisfying the specific positional relationship may include: the positioning plate is parallel to the calibration surface of the specific calibration part 2.
Further, in this example, the positioning plate and the N calibration parts satisfying the specific positional relationship includes:
the calibration surfaces of the N calibration parts are all within the field of view of the camera of the automatic mobile device placed on the positioning plate. For example, as shown in Fig. 12, when the automatic mobile device is placed on the positioning plate, all four calibration boards are within the field of view of its camera, that is, the shooting range of the camera covers each of the above calibration boards, so that as long as the device is on the positioning plate the camera can obtain the images of the N calibration boards in one shot.
As an optional implementation, the apparatus further includes:
a detection unit for detecting whether the image feature points in the calibration part image meet a preset condition;
in the case that the detection unit detects that the image feature points meet the preset condition, the processing unit calculates the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration part feature points.
In another embodiment of the present application, the device may further include a distance measuring assembly arranged at a specific position above the positioning plate, for measuring the distances to various places on the camera mounting surface of the automatic mobile device.
By measuring the distances from various points of the camera mounting surface to the distance measuring assembly, it can be detected whether the camera mounting surface is parallel to the calibration board 2 and to the positioning plate. For example, the distances of three points on the mounting surface are measured; since three points determine a plane, it can then be determined whether the camera mounting surface is parallel to the calibration board 2. In one embodiment of the present application, as shown in Fig. 12, the distance measuring assembly may include four laser distance measuring sensors A, B, C and D. In this example, laser sensors A and B may be arranged at the two lower corner points of the third calibration board 4, and laser sensors C and D may be arranged at the two lower corner points of the second calibration board 3. Further, the laser sensors A, B, C and D are located in the same plane, and the plane in which A, B, C and D lie is parallel to the first calibration board 2 at the top. In this way, the above four laser sensors can respectively detect the distances to four different points on the upper surface of the sweeping robot (that is, the camera mounting surface). According to the distances measured by the four laser distance measuring sensors A, B, C and D, it can be judged whether the camera mounting surface is tilted, that is, whether it is parallel to the positioning plate and the first calibration board 2. If it is not parallel, prompt information, such as an indicator light or a sound indication, can be generated by an alarm configured with the sensors to instruct the implementer to adjust the placement posture of the sweeping robot so that its camera mounting surface is parallel to the first calibration board 2. Of course, in other embodiments of the present application, the laser sensors may also be arranged at other positions, and the number of laser sensors only needs to be greater than or equal to 3, which is not limited in the present application.
Alternatively, this can be understood as follows: the distance measuring assembly includes at least three laser distance measuring sensors arranged at different positions in the same plane above the positioning plate; the laser beams are projected onto different positions of the camera mounting surface and are used to measure the distances from these different positions to the same plane, so as to determine whether the camera mounting surface is parallel to the calibration surface of the specific calibration part, wherein the same plane is parallel to the calibration surface of the specific calibration part.
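As an illustrative sketch only, such a levelness check from a few distance readings could look as follows; the tolerance value is an assumption:

```python
LEVEL_TOLERANCE = 0.002   # metres, assumed

def mounting_surface_level(distances):
    """distances: readings from the laser sensors (all mounted in one plane
    parallel to the first calibration board). If the mounting surface is
    parallel to that plane, all readings agree within a small tolerance."""
    return max(distances) - min(distances) <= LEVEL_TOLERANCE

# example: four readings from sensors A, B, C, D
print(mounting_surface_level([0.3502, 0.3498, 0.3501, 0.3499]))
```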
Through the multiple laser sensors arranged on the diagonal of the positioning plate and on the calibration boards, it is detected whether the automatic cleaning device with the camera on its top is placed horizontally on the positioning plate, ensuring that its position is correct; at the same time, the laser sensors obtain the distance from the optical center of the camera to each corner point of the top-view checkerboard, which is a prerequisite for the high accuracy of the subsequent intrinsic and extrinsic calibration.
As an optional implementation, the processing unit is configured to: calculate the internal parameters of the camera according to the coordinates of the image feature points of the N-1 calibration parts other than the specific calibration part in the calibration part image and the coordinates of the calibration part feature points of the N-1 calibration parts; and, based on the calculated internal parameters, calculate the external parameters of the camera according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part; the specific calibration part is the calibration part directly facing the camera.
The processing unit is further configured to perform iterative calculation on the original two-dimensional coordinates of the image feature points to obtain the sub-pixel precision two-dimensional coordinates of the image feature points.
The processing unit is further configured to: construct a correspondence between two-dimensional coordinates and three-dimensional coordinates by using multiple groups of corresponding two-dimensional coordinates of the image feature points and three-dimensional coordinates of the calibration part feature points; use the multiple groups of corresponding two-dimensional and three-dimensional coordinates to iteratively calculate the parameters of the objective function of the correspondence until the reprojection error is smaller than a preset threshold, so as to obtain the internal parameters, the internal parameters including a focal length parameter, an aperture center parameter and distortion parameters; based on the calculated internal parameters, construct a correspondence between two-dimensional coordinates and three-dimensional coordinates according to multiple groups of corresponding two-dimensional coordinates of the image feature points of the specific calibration part and three-dimensional coordinates of the calibration part feature points; and use the multiple groups of corresponding two-dimensional and three-dimensional coordinates to iteratively calculate the parameters of the objective function of the correspondence until the reprojection error is smaller than a preset threshold, so as to obtain the external parameters.
The processing unit is further configured to: determine whether the calculated internal and external parameters meet a preset standard; and output prompt information carrying a judgment result of non-conformance with the preset standard.
The apparatus further includes a memory for storing the internal and external parameters that meet the preset standard.
The apparatus further includes a lighting component for providing specific lighting conditions for the camera, a control instruction sending unit and a wireless communication unit; the control instruction sending unit sends a control instruction for acquiring the calibration part image to the camera through the wireless communication unit, and is used to control the camera to acquire the calibration part image.
The calibration parts include calibration boards with checkerboards, and the planes on which the checkerboards of the N calibration boards lie intersect each other in pairs; correspondingly, the image feature points are checkerboard corner points of the calibration boards in the image, and the calibration part feature points include the checkerboard corner points of the calibration boards.
Those skilled in the art can clearly understand that the embodiments of the present disclosure may refer to each other; for example, for convenience and brevity of description, for the specific working processes of the apparatus and of the units or modules in the apparatus described above, reference may be made to the corresponding process descriptions in the foregoing method embodiments, which are not repeated here.
Although the embodiments of the present application have been described with reference to the accompanying drawings, those skilled in the art can make various modifications and variations without departing from the spirit and scope of the present application, and such modifications and variations all fall within the scope defined by the appended claims.

Claims (25)

  1. A camera parameter calibration method, characterized by comprising:
    setting a camera at a specific position so that the camera and N calibration parts satisfy a specific positional relationship, N ≥ 3;
    controlling the camera to acquire calibration part images, each of the calibration part images containing images of the N calibration parts;
    calculating internal and external parameters of the camera according to position data of image feature points in the calibration part image and position data of calibration part feature points of the calibration parts.
  2. The camera parameter calibration method according to claim 1, characterized in that setting the camera at a specific position so that the camera and the N calibration parts satisfy the specific positional relationship comprises:
    making all the N calibration parts located within the field of view of the camera, and making the distance between the optical center of the camera and a specific calibration part among the N calibration parts a predetermined distance, the specific calibration part being the calibration part directly facing the camera.
  3. The camera parameter calibration method according to claim 2, characterized in that the method is applied to an automatic mobile device comprising the camera; correspondingly, making all the N calibration parts located within the field of view of the camera and making the distance between the optical center of the camera and a specific calibration part among the N calibration parts a predetermined distance comprises:
    placing the automatic mobile device on a predetermined positioning plate;
    adjusting the position of the automatic mobile device so that it is placed horizontally on the positioning plate, according to detection information of laser sensors arranged around the positioning plate.
  4. The camera parameter calibration method according to any one of claims 1 to 3, characterized in that, before calculating the internal and external parameters of the camera according to the position data of the image feature points in the calibration part image and the position data of the calibration part feature points of the calibration parts, the method further comprises:
    detecting whether the image feature points in the calibration part image meet a preset condition;
    in the case that the image feature points meet the preset condition, calculating the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration part feature points.
  5. The camera parameter calibration method according to claim 4, characterized in that calculating the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration part feature points comprises:
    calculating the internal parameters of the camera according to the coordinates of the image feature points of N-1 calibration parts other than a specific calibration part in the calibration part image and the coordinates of the calibration part feature points of the N-1 calibration parts;
    based on the calculated internal parameters, calculating the external parameters of the camera according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part; the specific calibration part being the calibration part directly facing the camera.
  6. The camera parameter calibration method according to claim 5, characterized in that the coordinates of the image feature points are sub-pixel precision two-dimensional coordinates of the image feature points, and the sub-pixel precision two-dimensional coordinates are calculated in the following manner:
    performing iterative calculation on the original two-dimensional coordinates of the image feature points to obtain the sub-pixel precision two-dimensional coordinates of the image feature points.
  7. The camera parameter calibration method according to claim 5, characterized in that calculating the internal parameters of the camera comprises:
    constructing a correspondence between two-dimensional coordinates and three-dimensional coordinates by using multiple groups of corresponding two-dimensional coordinates of the image feature points and three-dimensional coordinates of the calibration part feature points;
    using the multiple groups of corresponding two-dimensional and three-dimensional coordinates, iteratively calculating parameters of an objective function of the correspondence until a reprojection error is smaller than a preset threshold, so as to obtain the internal parameters, the internal parameters comprising a focal length parameter, an aperture center parameter and distortion parameters.
  8. The camera parameter calibration method according to claim 5, characterized in that calculating the external parameters of the camera comprises:
    based on the calculated internal parameters, constructing a correspondence between two-dimensional coordinates and three-dimensional coordinates according to multiple groups of corresponding two-dimensional coordinates of the image feature points of the specific calibration part and three-dimensional coordinates of the calibration part feature points;
    using the multiple groups of corresponding two-dimensional and three-dimensional coordinates, iteratively calculating parameters of an objective function of the correspondence until a reprojection error is smaller than a preset threshold, so as to obtain the external parameters.
  9. The camera parameter calibration method according to any one of claims 1 to 3, characterized in that, after calculating the internal and external parameters of the camera according to the position data of the image feature points in the calibration part image and the position data of the calibration part feature points of the calibration parts, the method further comprises:
    determining whether the calculated internal and external parameters meet a preset standard;
    if they meet the preset standard, writing the internal and external parameters into a memory;
    if they do not meet the preset standard, providing corresponding prompt information.
  10. The camera parameter calibration method according to any one of claims 1 to 3, characterized in that controlling the camera to acquire the calibration part image comprises:
    using a preset lighting component to provide specific lighting conditions for the camera;
    wirelessly sending a control instruction for acquiring the calibration part image to the camera, so as to control the camera to acquire the calibration part image.
  11. The camera parameter calibration method according to any one of claims 1 to 3, characterized in that the calibration parts comprise calibration boards with checkerboards, the planes on which the checkerboards of the N calibration boards lie intersect each other in pairs; correspondingly, the image feature points are checkerboard corner points of the calibration boards in the image, and the calibration part feature points comprise the checkerboard corner points of the calibration boards.
  12. A camera parameter calibration apparatus, characterized in that the apparatus comprises:
    a calibration module comprising N calibration parts, N ≥ 3;
    a camera positioning unit for positioning a camera at a specific position so that the camera and the N calibration parts satisfy a specific positional relationship;
    a control unit for controlling the camera to acquire calibration part images, each of the calibration part images containing images of the N calibration parts;
    a processing unit configured to calculate internal and external parameters of the camera according to position data of image feature points in the calibration part image and position data of calibration part feature points of the calibration parts.
  13. The camera parameter calibration apparatus according to claim 12, characterized in that the camera positioning unit is configured to:
    make all the N calibration parts located within the field of view of the camera, and make the distance between the optical center of the camera and a specific calibration part among the N calibration parts a predetermined distance, the specific calibration part being the calibration part directly facing the camera.
  14. The camera parameter calibration apparatus according to claim 13, characterized in that the apparatus is applied to an automatic mobile device comprising the camera; correspondingly, the camera positioning unit comprises:
    a positioning plate for placing the automatic mobile device so that all the N calibration parts are located within the field of view of the camera;
    a detection sensor for providing detection information reflecting whether the automatic mobile device is placed horizontally.
  15. The camera parameter calibration apparatus according to any one of claims 12 to 14, characterized by further comprising:
    a detection unit for detecting whether the image feature points in the calibration part image meet a preset condition;
    in the case that the detection unit detects that the image feature points meet the preset condition, the processing unit calculates the internal and external parameters of the camera according to the coordinates of the image feature points and the coordinates of the calibration part feature points.
  16. The camera parameter calibration apparatus according to claim 15, characterized in that the processing unit is configured to: calculate the internal parameters of the camera according to the coordinates of the image feature points of N-1 calibration parts other than a specific calibration part in the calibration part image and the coordinates of the calibration part feature points of the N-1 calibration parts; and, based on the calculated internal parameters, calculate the external parameters of the camera according to the coordinates of the image feature points of the specific calibration part and the coordinates of the calibration part feature points of the specific calibration part; the specific calibration part being the calibration part directly facing the camera.
  17. The camera parameter calibration apparatus according to claim 14, characterized in that
    the positioning plate and the calibration module are arranged opposite each other, and a specific calibration part among the N calibration parts is located directly above or below one side of the positioning plate; correspondingly, the positioning plate and the N calibration parts satisfying the specific positional relationship comprises: the positioning plate is parallel to a calibration surface of the specific calibration part.
  18. The camera parameter calibration apparatus according to claim 17, characterized in that the positioning plate and the N calibration parts satisfying the specific positional relationship comprises: the calibration surfaces of the N calibration parts are all within the field of view of the camera of the automatic mobile device placed on the positioning plate.
  19. The camera parameter calibration apparatus according to claim 17 or 18, characterized by further comprising a distance measuring assembly arranged at a specific position above the positioning plate, for measuring distances to various places on a camera mounting surface of the automatic mobile device.
  20. The camera parameter calibration apparatus according to claim 19, characterized in that the distance measuring assembly comprises at least three laser distance measuring sensors arranged at different positions in a same plane above the positioning plate, the laser beams being projected onto different positions of the camera mounting surface and being used to measure the distances from the different positions to the same plane, so as to determine whether the camera mounting surface is parallel to the calibration surface of the specific calibration part, wherein the same plane is parallel to the calibration surface of the specific calibration part.
  21. The camera parameter calibration apparatus according to claim 17 or 18, characterized in that the calibration module comprises a semi-enclosed box, the box opening of the box faces the positioning plate, and the calibration parts are installed inside the semi-enclosed box.
  22. The camera parameter calibration apparatus according to claim 21, characterized in that the calibration parts are calibration boards; correspondingly, the calibration module further comprises:
    a first calibration board horizontally installed on the top of the semi-enclosed box, with its calibration surface located directly above the positioning plate and parallel to the positioning plate;
    a second calibration board, a third calibration board and a fourth calibration board respectively installed on the side faces of the semi-enclosed box, with their calibration surfaces all inclined toward the positioning plate.
  23. The camera parameter calibration apparatus according to claim 14, 17 or 18, characterized in that a positioning groove is formed in the upper surface of the positioning plate for restricting the movement of the automatic mobile device.
  24. The camera parameter calibration apparatus according to claim 23, characterized in that the camera positioning unit further comprises: a first transverse scale line and a first longitudinal scale line arranged below the positioning plate, the first transverse scale line and the first longitudinal scale line being perpendicular to each other, and the intersection of the first transverse scale line and the first longitudinal scale line passing through the center of the positioning groove.
  25. The camera parameter calibration apparatus according to claim 21, characterized in that the calibration module further comprises: a light strip arranged in the semi-enclosed box for providing illumination during the calibration process.
PCT/CN2021/095821 2020-05-25 2021-05-25 相机参数标定方法及装置 WO2021238923A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010447821.5 2020-05-25
CN202020890263.5U CN212433821U (zh) 2020-05-25 2020-05-25 一种相机参数标定设备
CN202010447821.5A CN111612853B (zh) 2020-05-25 2020-05-25 相机参数标定方法及装置
CN202020890263.5 2020-05-25

Publications (1)

Publication Number Publication Date
WO2021238923A1 true WO2021238923A1 (zh) 2021-12-02

Family

ID=78722991

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/095821 WO2021238923A1 (zh) 2020-05-25 2021-05-25 相机参数标定方法及装置

Country Status (1)

Country Link
WO (1) WO2021238923A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190073795A1 (en) * 2016-05-13 2019-03-07 Olympus Corporation Calibration device, calibration method, optical device, photographing device, projecting device, measurement system, and measurement method
CN106846415A (zh) * 2017-01-24 2017-06-13 长沙全度影像科技有限公司 一种多路鱼眼相机双目标定装置及方法
CN109215082A (zh) * 2017-06-30 2019-01-15 杭州海康威视数字技术股份有限公司 一种相机参数标定方法、装置、设备及系统
CN110599548A (zh) * 2019-09-02 2019-12-20 Oppo广东移动通信有限公司 摄像头的标定方法、装置、相机及计算机可读存储介质
CN111612853A (zh) * 2020-05-25 2020-09-01 追创科技(苏州)有限公司 相机参数标定方法及装置
CN212433821U (zh) * 2020-05-25 2021-01-29 追创科技(苏州)有限公司 一种相机参数标定设备

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114299157A (zh) * 2021-12-15 2022-04-08 苏州大学 一种托卡马克仓内立体相机延拓接力标定的方法及系统
CN114299157B (zh) * 2021-12-15 2022-11-08 苏州大学 一种托卡马克仓内立体相机延拓接力标定的方法及系统
EP4224199A3 (en) * 2022-01-12 2023-11-15 Willand (Beijing) Technology Co., Ltd. Method for calibrating lawnmower, electronic device, storage medium and lawnmower
CN114519747A (zh) * 2022-02-28 2022-05-20 嘉兴市像景智能装备有限公司 一种泛电子领域自动光学检测设备的标定方法
CN114519747B (zh) * 2022-02-28 2024-02-09 嘉兴市像景智能装备有限公司 一种泛电子领域自动光学检测设备的标定方法
CN114800520A (zh) * 2022-05-23 2022-07-29 北京迁移科技有限公司 高精度的手眼标定方法
CN114800520B (zh) * 2022-05-23 2024-01-23 北京迁移科技有限公司 高精度的手眼标定方法
CN115070779B (zh) * 2022-08-22 2023-03-24 菲特(天津)检测技术有限公司 机器人抓取控制方法、系统及电子设备
CN115070779A (zh) * 2022-08-22 2022-09-20 菲特(天津)检测技术有限公司 机器人抓取控制方法、系统及电子设备
CN115690226A (zh) * 2022-10-27 2023-02-03 合肥中科君达视界技术股份有限公司 一种基于Scheimpflug定律的大视场3D轮廓测量仪标定方法
CN115690226B (zh) * 2022-10-27 2024-02-13 合肥中科君达视界技术股份有限公司 一种基于Scheimpflug定律的大视场3D轮廓测量仪标定方法
CN116182702A (zh) * 2023-01-31 2023-05-30 桂林电子科技大学 一种基于主成分分析的线结构光传感器标定方法及系统
CN116182702B (zh) * 2023-01-31 2023-10-03 桂林电子科技大学 一种基于主成分分析的线结构光传感器标定方法及系统
CN116563391A (zh) * 2023-05-16 2023-08-08 深圳市高素科技有限公司 一种基于机器视觉的激光结构自动标定方法
CN116563391B (zh) * 2023-05-16 2024-02-02 深圳市高素科技有限公司 一种基于机器视觉的激光结构自动标定方法
CN117911540A (zh) * 2024-03-18 2024-04-19 安徽大学 一种用于事件相机标定装置与方法
CN117911541A (zh) * 2024-03-19 2024-04-19 杭州灵西机器人智能科技有限公司 沙姆相机标定方法、装置和系统

Similar Documents

Publication Publication Date Title
WO2021238923A1 (zh) 相机参数标定方法及装置
CN111612853B (zh) 相机参数标定方法及装置
US11544874B2 (en) System and method for calibration of machine vision cameras along at least three discrete planes
CN109029299B (zh) 舱段销孔对接转角的双相机测量装置及测量方法
US9488589B2 (en) Mapping damaged regions on objects
US20120148145A1 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
EP2104365A1 (en) Method and apparatus for rapid three-dimensional restoration
CN110926330B (zh) 图像处理装置和图像处理方法
CN113324478A (zh) 一种线结构光的中心提取方法及锻件三维测量方法
US10819972B2 (en) Method and apparatus for light and computer vision based dimensional metrology and 3D reconstruction
CN112767338A (zh) 一种基于双目视觉的装配式桥梁预制构件吊装定位系统及其方法
FI129042B (en) Computer vision system with a computer-generated virtual reference object
CN115187612A (zh) 一种基于机器视觉的平面面积测量方法、装置及系统
Williams et al. Automatic image alignment for 3D environment modeling
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
JP6906177B2 (ja) 交点検出装置、カメラ校正システム、交点検出方法、カメラ校正方法、プログラムおよび記録媒体
CN116147477A (zh) 联合标定方法、孔位检测方法、电子设备以及存储介质
CN111915666A (zh) 基于移动终端的体积测量方法及装置
CN115307865A (zh) 一种面向高温高超声速流场的模型变形测量方法
WO2005073669A1 (en) Semi and fully-automatic camera calibration tools using laser-based measurement devices
CN117115233B (zh) 一种基于机器视觉的尺寸测量方法、装置及电子设备
CN113781581B (zh) 基于靶标松姿态约束的景深畸变模型标定方法
JP4196784B2 (ja) カメラ位置測定装置および方法並びにカメラ位置制御方法
CN117372532A (zh) 全景影像标定装置、系统、方法、计算机系统及存储介质
CN118135023A (zh) 一种相机拍摄图像相对于扫描仪坐标系的标定方法、记录媒体和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21813384

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21813384

Country of ref document: EP

Kind code of ref document: A1