WO2023284349A1 - 3d camera calibration method and apparatus, and calibration system - Google Patents

3D camera calibration method and apparatus, and calibration system

Info

Publication number
WO2023284349A1
Authority
WO
WIPO (PCT)
Prior art keywords: calibration, camera, coordinates, point cloud, scanned
Application number: PCT/CN2022/087716
Other languages: French (fr), Chinese (zh)
Inventor: Hu Xiandong (胡显东), Miao Feng (缪丰), Jia Shiyong (贾仕勇)
Original Assignee: Wuxi Lead Intelligent Equipment Co., Ltd. (无锡先导智能装备股份有限公司)
Application filed by Wuxi Lead Intelligent Equipment Co., Ltd. (无锡先导智能装备股份有限公司)
Publication of WO2023284349A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85: Stereo camera calibration
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20: Finite element generation, e.g. wire-frame surface description, tesselation
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose

Definitions

  • the present application relates to the technical field of machine vision, in particular to a 3D camera calibration method, device, and calibration system.
  • 3D cameras that can take stereoscopic images have appeared.
  • two 3D cameras are sometimes used for image processing after shooting.
  • the cameras need to be calibrated to obtain the coordinate transformation relationship of the two cameras.
  • a common field of view is usually required, and camera calibration is performed based on the common field of view.
  • other electronic tools, such as a laser rangefinder, are usually required for calibration, and the operation is complicated.
  • a 3D camera calibration method comprising:
  • obtaining scanned point cloud images of a first camera and a second camera to be calibrated, where the scanned point cloud images are generated by scanning calibration holes on a calibration object;
  • coordinate transformation relationship data between the first camera and the second camera is obtained by calculation.
  • extracting the center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates includes:
  • obtaining the normal vector of the plane where the calibration holes scanned by the first camera are located includes:
  • At least 3 non-collinear calibration holes are provided on the opposite surfaces of the calibration object, and the positions of the calibration holes on the opposite surfaces correspond;
  • the scanned point cloud image of the first camera is a point cloud image generated by the first camera scanning calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is a point cloud image generated by the second camera scanning calibration holes on the other surface of the calibration object.
  • the calibration hole distance data includes the calibration object width, which is the width between the two opposite surfaces of the calibration object on which the calibration holes are provided; calculating the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the preset calibration hole distance data includes:
  • calculating, from the first coordinates, the normal vector, and the calibration object width, the extension coordinates of the calibration holes scanned by the second camera in the first camera.
  • two rows of calibration holes are provided on the calibration surface of the calibration object; the calibration holes in the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes in the same row lie is perpendicular to the line connecting two calibration holes at corresponding positions in the two rows;
  • the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface;
  • the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface.
  • the calibration hole distance data includes the hole spacing between two calibration holes at corresponding positions in the two rows of calibration holes; calculating the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the preset calibration hole distance data includes:
  • calculating, from the first coordinates, the normal vector, the straight line equation, and the hole spacing, the extension coordinates of the calibration holes scanned by the second camera in the first camera.
  • a 3D camera calibration device comprising:
  • the image acquisition module is used to obtain the scanned point cloud images of the first camera and the second camera to be calibrated, and the scanned point cloud images are point cloud images generated by scanning the calibration holes on the calibration object;
  • a coordinate extraction module configured to extract the center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
  • a normal vector obtaining module configured to obtain the normal vector of the plane where the calibration holes scanned by the first camera are located;
  • a coordinate mapping module configured to calculate the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the preset calibration hole distance data;
  • a relationship acquiring module configured to calculate the coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
  • a calibration system comprising:
  • the first camera is used to scan the calibration hole on the calibration object to generate a point cloud image to obtain the scanned point cloud image of the first camera;
  • the second camera is used to scan the calibration hole on the calibration object to generate a point cloud image to obtain the scanned point cloud image of the second camera;
  • a controller connected to the first camera and the second camera, the controller including a memory and a processor, where the memory stores a computer program and the processor, when executing the computer program, implements the following steps:
  • obtaining scanned point cloud images of the first camera and the second camera to be calibrated, where the scanned point cloud images are generated by scanning calibration holes on the calibration object;
  • coordinate transformation relationship data between the first camera and the second camera is obtained by calculation.
  • the above-mentioned calibration system also includes the calibration object.
  • At least 3 non-collinear calibration holes are provided on the two opposite surfaces of the calibration object, with the positions of the calibration holes on the two opposite surfaces corresponding; or two rows of calibration holes are provided on the calibration surface of the calibration object, where the calibration holes in the same row form a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes in the same row lie is perpendicular to the line connecting two calibration holes at corresponding positions in the two rows.
  • the above-mentioned 3D camera calibration method, apparatus, and calibration system extract the first coordinates of the calibration holes scanned by the first camera from the point cloud image generated by the first camera scanning the calibration holes on the calibration object, and extract the second coordinates of the calibration holes scanned by the second camera from the point cloud image generated by the second camera scanning the calibration holes on the calibration object; then, from the first coordinates, the normal vector of the plane where the calibration holes scanned by the first camera are located, and the preset calibration hole distance data, the extension coordinates of the calibration holes scanned by the second camera in the first camera are calculated; finally, the correspondence between the second coordinates and the extension coordinates is calculated to obtain the transformation relationship data between the coordinate systems of the first camera and the second camera. In this way, the method applies to the calibration of two cameras that have no common field of view, without relying on a common field of view: a single calibration object suffices to calibrate the two cameras simply and efficiently.
  • Fig. 1 is a schematic flow chart of a 3D camera calibration method in an embodiment;
  • Fig. 2 is a schematic diagram of two cameras scanning in one embodiment;
  • Fig. 3 is a schematic diagram of two points on opposite sides of the calibration object in one embodiment;
  • Fig. 4 is a schematic diagram of the calibration process for two cameras scanning toward each other;
  • Fig. 5 is a schematic diagram of spliced scanning by two cameras in one embodiment;
  • Fig. 6 is a schematic diagram of the calibration process for spliced scanning by two cameras;
  • Fig. 7 is a structural block diagram of a 3D camera calibration device in an embodiment.
  • "connection" in the following embodiments should be understood as "electrical connection", "communication connection", and the like if electrical signals or data are transmitted between the connected objects.
  • a 3D camera calibration method is provided, which can be applied to a device used for calibration processing, such as a controller. Taking the application to the controller as an example, the method includes the following steps:
  • S110 Obtain scanned point cloud images of the first camera and the second camera to be calibrated, where the scanned point cloud images are point cloud images generated by scanning the calibration holes on the calibration object.
  • the first camera and the second camera are 3D cameras that need to be calibrated.
  • the calibration object is an object used for calibrating the first camera and the second camera, such as a wooden block, and holes are opened on the calibration object as calibration holes, and the number of calibration holes is multiple.
  • the first camera scans one part of the calibration holes on the calibration object, and
  • the second camera scans another part of the calibration holes on the calibration object; that is, the calibration holes scanned by the first camera and by the second camera are different, and the first camera and the second camera have no common field of view.
  • the calibration hole may be a round hole, or a hole of other shapes.
  • the first camera and the second camera are used to scan the calibration hole on the calibration object to generate scanned point cloud images; the controller acquires the scanned point cloud images of the first camera and the scanned point cloud images of the second camera.
  • the hole center coordinates of a calibration hole are the coordinates of the center point of the hole.
  • when the calibration hole is a round hole, the hole center coordinates are the coordinates of the center of the circle.
  • the first coordinate is the coordinate value of the calibration hole scanned by the first camera in the coordinate system of the first camera
  • the second coordinate is the coordinate value of the calibration hole scanned by the second camera in the coordinate system of the second camera.
  • the first coordinate and the second coordinate are three-dimensional coordinate values.
  • the first camera scans a plurality of calibration holes; for each calibration hole in the scanned point cloud image of the first camera, the hole center coordinates are extracted to obtain the corresponding first coordinates, so multiple first coordinates can be obtained from the scanned point cloud image of the first camera.
  • similarly, multiple second coordinates can be obtained from the scanned point cloud image of the second camera.
  • the calibration object has multiple surfaces, and the calibration holes scanned by the same camera are located on the same surface of the calibration object, for example, the calibration holes scanned by the first camera are located on the same surface of the calibration object.
  • S170 Estimate the extension coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinate, the normal vector and the preset calibration hole distance data.
  • the calibration hole distance data represents the distance between the calibration hole scanned by the first camera and the calibration hole scanned by the second camera.
  • the extension coordinates refer to the coordinate values, in the coordinate system of the first camera, of the positions of the calibration holes scanned by the second camera. According to the first coordinates, the normal vector, and the calibration hole distance data, the extension coordinates of the calibration holes scanned by the second camera in the first camera are calculated; that is, the calibration holes scanned by the second camera are mapped to coordinate values in the coordinate system of the first camera.
  • S190 According to the second coordinates and the extended coordinates, calculate and obtain the coordinate transformation relationship data between the first camera and the second camera.
  • the second coordinates are the coordinate values of the calibration holes scanned by the second camera in the coordinate system of the second camera, and the extension coordinates are the coordinate values of those same calibration holes in the coordinate system of the first camera; thus the second coordinates and the extension coordinates corresponding to the same calibration hole are coordinate values in different coordinate systems.
  • the transformation relation data is data representing a transformation relation between the coordinates of the first camera and the coordinates of the second camera, and may specifically be a matrix.
  • the second coordinates and the extension coordinates are three-dimensional coordinate values. The second coordinates of each calibration hole form one row of a matrix, so an original matrix is obtained from the second coordinates of the multiple calibration holes; likewise, the extension coordinates of each calibration hole form one row of another matrix, giving an extension matrix. A mathematical algorithm is then used to calculate the data needed to transform the original matrix into the extension matrix (or the extension matrix into the original matrix), yielding the transformation matrix.
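The patent does not name the mathematical algorithm. A common choice for solving a rigid transform between two sets of corresponding 3D points is the SVD-based (Kabsch) method; the sketch below is illustrative only, and the function name is hypothetical:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ src @ R.T + t,
    given corresponding 3D points (one row per calibration hole)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c = src - src.mean(axis=0)   # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c              # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Here `src` would hold the extension coordinates and `dst` the second coordinates (or vice versa, depending on which direction of conversion is wanted).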
  • the above-mentioned 3D camera calibration method extracts the first coordinates of the calibration holes scanned by the first camera from the scanned point cloud image generated by the first camera scanning the calibration holes on the calibration object, and extracts the second coordinates of the calibration holes scanned by the second camera from the scanned point cloud image generated by the second camera.
  • step S130, extracting the hole center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates, includes: extracting the edge data points of the calibration holes in the scanned point cloud image of the first camera; and extracting the hole center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
  • extracting the edge data points of a calibration hole means determining the edge of the calibration hole.
  • extracting the hole center from the edge data points gives high accuracy.
  • the gradient method is used to extract the edge data points of the calibration hole, so that the edge can be determined accurately; a spatial center fitting algorithm is then used to extract the center point of the calibration hole, so that accurate hole center coordinates are obtained.
  • when the calibration hole is a round hole, the edge data points of the round hole are extracted using the gradient method, and the circle center coordinates are extracted using the spatial center fitting algorithm.
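One way to realize the round-hole center fitting step (not necessarily the patent's actual algorithm) is to fit a plane to the 3D edge points, project them into that plane, and run an algebraic (Kåsa) circle fit. An illustrative sketch with a hypothetical function name:

```python
import numpy as np

def fit_circle_center_3d(edge_pts):
    """Fit the center of a circular hole from 3D edge points:
    SVD plane fit, project into the plane, then a linear (Kasa)
    circle fit x^2 + y^2 = 2a*x + 2b*y + c in plane coordinates."""
    P = np.asarray(edge_pts, float)
    centroid = P.mean(axis=0)
    # The two dominant principal directions span the hole's plane.
    _, _, Vt = np.linalg.svd(P - centroid)
    u, v = Vt[0], Vt[1]
    xy = np.column_stack([(P - centroid) @ u, (P - centroid) @ v])
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b, _c = sol                   # (a, b) is the 2D circle center
    return centroid + a * u + b * v  # lift back to 3D
```

The Kåsa fit is linear and fast; an orthogonal-distance circle fit could replace it where the edge points are noisy.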
  • in step S130, the step of extracting the hole center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain the second coordinates is the same as the step of extracting the hole center coordinates from the scanned point cloud image of the first camera to obtain the first coordinates, and is not repeated here.
  • step S150 includes: selecting at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located; determining the plane from the selected points; and calculating the normal vector of the plane.
  • three non-collinear points determine a plane, and the normal vector of the determined plane is then calculated.
  • the selected at least three non-collinear points are points outside the calibration holes that lie on the same plane as the calibration holes.
  • for example, the first camera scans the calibration holes on the left side of the calibration object, and three non-collinear points beside the calibration holes on the left side are selected to determine the plane and obtain the normal vector.
  • the normal vector of the plane where the calibration hole is located can be extracted through a plane fitting algorithm.
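A common plane-fitting choice is least squares via SVD, where the normal is the direction of least variance of the selected points. A minimal sketch (illustrative; the function name is hypothetical):

```python
import numpy as np

def plane_normal(points):
    """Least-squares plane fit: the normal is the right singular
    vector associated with the smallest singular value of the
    centered point set."""
    P = np.asarray(points, float)
    _, _, Vt = np.linalg.svd(P - P.mean(axis=0))
    n = Vt[-1]
    return n / np.linalg.norm(n)
```

With exactly three non-collinear points this reduces to the plane through them; with more points it averages out scanning noise.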
  • At least 3 non-collinear calibration holes are provided on each of the two opposite surfaces of the calibration object, and the positions of the calibration holes on the two opposite surfaces correspond;
  • the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface.
  • the calibration object is a standard part, preferably an ideal cuboid.
  • the opposite surfaces of the calibration object include the left and right sides, the front and rear sides, and the upper and lower sides;
  • the calibration holes on opposite surfaces correspond in position.
  • for example, the first camera scans the left side and the second camera scans the right side; this embodiment thus applies to the situation where the two cameras scan toward each other.
  • the two cameras are arranged on the left and right sides of the calibration object; on the left and right sides, several high-precision round holes are drilled (at least 3, and not on one line), ensuring that on the planes of the two sides the numbers of round holes are equal and the positions of the round holes correspond.
  • The holes may be drilled through.
  • the first camera (camera 1) scans each calibration hole on the left plane, and the second camera (camera 2) scans each calibration hole on the right plane.
  • the scanning direction does not need to be parallel to the surface of the calibration object.
  • the distance data of the calibration hole includes the width of the calibration object, wherein the width of the calibration object is the width between two opposite surfaces of the calibration object on which the calibration hole is formed.
  • Step S170 includes: calculating the extension coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinates, the normal vector and the width of the calibration object.
  • that is, the coordinates of the calibration holes scanned by the second camera in the coordinate system of the first camera are obtained;
  • the coordinates can be calculated according to a geometric algorithm.
  • the coordinate system of the first camera, which scans the left side of the calibration object, is taken as the reference frame;
  • the coordinates of point A on the left side of the calibration object are known;
  • the thickness d between the left and right sides of the calibration object is known;
  • the normal vector is known;
  • the equation of the straight line through point A can be determined from the normal vector and the coordinates of point A; then, from this straight line equation, the coordinates of point A, and the calibration object width (the distance between point A and point B), the coordinates of point B on the right side are calculated to obtain the extension coordinates.
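For this opposing-surface case, the extension coordinate is simply the known hole center pushed along the surface normal by the calibration object width. A minimal sketch (illustrative; whether to add or subtract along the normal depends on the normal's orientation, which a real implementation must resolve):

```python
import numpy as np

def extend_through_object(a, normal, width):
    """Map hole center A on the near surface to the corresponding
    hole center B on the far surface: B lies on the line through A
    along the unit surface normal, at distance `width` (= d)."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)  # normalize; sign convention is assumed
    return np.asarray(a, float) + width * n
```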
  • Fig. 4 is a schematic diagram of the calibration process for two cameras scanning toward each other. After the calibration program is started, camera 1 and camera 2 scan the calibration object to generate point cloud images, and the obtained point cloud images are then used to fit the N circles on the left and right planes respectively.
  • the coordinate processing in this application is completed in three-dimensional space, and is not sensitive to factors such as camera position and scanning direction, so this application has better robustness and higher precision in practical applications.
  • two rows of calibration holes are provided on the calibration surface of the calibration object; the calibration holes in the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes in the same row lie is perpendicular to the line connecting two calibration holes at corresponding positions in the two rows.
  • the positions of the two rows of calibration holes are opposite, correspond one to one, and are equal in number; that is, the first calibration hole in the first row corresponds in position to the first calibration hole in the second row, the second calibration hole in the first row corresponds to the second calibration hole in the second row, and so on.
  • two calibration holes at corresponding positions in the two rows are, for example, the first calibration hole in the first row and the first calibration hole in the second row, or the second calibration hole in the first row and the second calibration hole in the second row.
  • Taking the calibration surface as an example, the two rows of calibration holes can be arranged along two opposite edges of the calibration surface.
  • the scanning point cloud image of the first camera is the point cloud image generated by the first camera scanning a row of calibration holes on the calibration surface
  • the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface.
  • This embodiment applies to the case where two cameras scan for splicing, that is, the two cameras scan the two sides of the same surface of the object, and the two sides cannot be spliced directly because there is no common field of view. It should be noted that this application concerns 3D cameras, so the above splicing is not only left-right splicing; the heights of the two parts may also be unequal.
  • one surface of the calibration object is used as the calibration surface, and the two sides of the calibration surface serve as the regions to be detected;
  • a row of calibration holes is made on each side.
  • the centers of each row of calibration holes must lie on the same straight line.
  • the numbers of calibration holes in the two rows are equal and their positions correspond one to one, and the line on which a row of calibration holes lies must be perpendicular to the line connecting two calibration holes at corresponding positions in the two rows. For example, as shown in Fig. 5, two rows of round holes are provided on the upper surface of the calibration object.
  • the first camera scans the first row of calibration holes;
  • the second camera scans the second row of calibration holes;
  • the resulting transformation matrix between the two camera coordinate systems can be used to stitch images or for other joint image processing.
  • the splicing implementation places no common-field-of-view requirement on the cameras; that is, different images need not contain a common field of view, and the field of view of the 3D cameras used only needs to cover the detection area. There is no need to select a large-field-of-view camera to cover a common field of view, which ensures detection accuracy.
  • the calibration hole distance data includes the hole spacing between two calibration holes at corresponding positions in the two rows of calibration holes; for example, the hole spacing can be the distance between the first calibration hole in the first row and the first calibration hole in the second row.
  • Step S170 includes: according to the first coordinates of the plurality of calibration holes scanned by the first camera, fitting the straight line equation of the straight line where the calibration holes scanned by the first camera are located; according to the first coordinates, the normal vector, the straight line equation and the hole spacing, Estimate the extended coordinates of the calibration hole scanned by the second camera in the first camera.
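The row-line fitting step just described can be done by least squares on the hole centers, e.g. taking the dominant principal direction of the centered centers. An illustrative sketch with a hypothetical name:

```python
import numpy as np

def fit_line(centers):
    """Least-squares 3D line through the hole centers of one row:
    returns a point on the line (the centroid) and the unit
    direction (first right singular vector of the centered points)."""
    P = np.asarray(centers, float)
    c = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c)
    d = Vt[0]
    return c, d / np.linalg.norm(d)
```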
  • the first coordinates are used to fit the straight line equation;
  • the first coordinates, the normal vector, the straight line equation, and the hole spacing between two corresponding calibration holes are then used to obtain the coordinates of the calibration holes scanned by the second camera in the coordinate system of the first camera, which can be calculated according to a geometric algorithm. For example, in Fig. 5, point C is located in the first row and point D in the second row.
  • the straight line through points C and D can be uniquely determined, so the equation of the line CD is obtained; then, from the equation of line CD and the coordinates of point C, the coordinates of point D are calculated to obtain the extension coordinates. Similarly, the extension coordinates of the other calibration holes in the second row can be calculated.
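The geometric step above (the C-D line lies in the calibration surface and is perpendicular to the row line) can be sketched as follows. This is illustrative only; the sign of the offset depends on which side the second row sits, an assumption a real implementation must resolve:

```python
import numpy as np

def extend_across_rows(c, row_dir, normal, spacing):
    """Map a hole center C in the first row to the corresponding
    center D in the second row. The C-D direction lies in the
    calibration surface and is perpendicular to the row, i.e. the
    cross product of the surface normal and the fitted row direction."""
    d = np.cross(np.asarray(normal, float), np.asarray(row_dir, float))
    d = d / np.linalg.norm(d)          # unit in-plane perpendicular
    return np.asarray(c, float) + spacing * d
```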
  • FIG. 6 it is a schematic diagram of the calibration process of spliced scanning in one embodiment.
  • Although the steps in the flow charts of Fig. 1, Fig. 4, and Fig. 6 are shown sequentially as indicated by the arrows, these steps are not necessarily executed in the order indicated. Unless otherwise specified herein, there is no strict restriction on the execution order of these steps, and they may be executed in other orders. Moreover, at least some of the steps in Fig. 1, Fig. 4, and Fig. 6 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the execution order of these sub-steps or stages is likewise not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
  • a 3D camera calibration device is provided, including: an image acquisition module 710, a coordinate extraction module 730, a normal vector acquisition module 750, a coordinate mapping module 770, and a relationship acquisition module 790, wherein:
  • the image acquisition module 710 is used to obtain the scanned point cloud images of the first camera to be calibrated and the second camera, and the scanned point cloud image is a point cloud image generated by scanning the calibration hole on the calibration object;
  • the coordinate extraction module 730 is used to extract the hole center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates, and to extract the hole center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain the second coordinates;
  • the normal vector acquisition module 750 is used to obtain the normal vector of the plane where the calibration holes scanned by the first camera are located;
  • the coordinate mapping module 770 is used to calculate the extended coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinates, the normal vector and the preset calibration hole distance data;
  • the relationship acquisition module 790 is used to calculate the coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
  • the above-mentioned 3D camera calibration device extracts the first coordinates of the calibration holes scanned by the first camera from the point cloud image generated by the first camera scanning the calibration holes on the calibration object, and extracts the second coordinates of the calibration holes scanned by the second camera from the point cloud image generated by the second camera scanning the calibration holes on the calibration object.
  • the coordinate extraction module 730 is used to extract the edge data points of the calibration holes in the scanned point cloud image of the first camera and to extract the hole center coordinates from those edge data points to obtain the first coordinates, and likewise to extract the edge data points of the calibration holes in the scanned point cloud image of the second camera and to extract the hole center coordinates from them to obtain the second coordinates.
  • the normal vector acquisition module 750 selects at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located, determines the plane from the selected points, and calculates the normal vector of the plane.
  • At least 3 non-collinear calibration holes are provided on each of the two opposite surfaces of the calibration object, and the positions of the calibration holes on the two opposite surfaces correspond;
  • the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface.
  • the calibration hole distance data includes the calibration object width, wherein the calibration object width is the width between two opposite surfaces of the calibration object on which the calibration hole is set.
  • the coordinate mapping module 770 calculates the extended coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinate, the normal vector and the width of the calibration object.
  • two rows of calibration holes are provided on the calibration surface of the calibration object; the calibration holes in the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes in the same row lie is perpendicular to the line connecting two calibration holes at corresponding positions in the two rows;
  • the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface;
  • the calibration hole distance data includes the hole spacing between two position-corresponding calibration holes of the two rows; the coordinate mapping module 770 fits, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the line on which the calibration holes scanned by the first camera lie, and calculates, according to the first coordinates, the normal vector, the line equation and the hole spacing, the extended coordinates in the first camera of the calibration holes scanned by the second camera.
  • Each module in the above 3D camera calibration device can be realized in whole or in part by software, hardware or a combination thereof.
  • the above modules can be embedded in, or independent of, the processor of the controller in the form of hardware, or stored in the memory of the controller in the form of software, so that the processor can invoke and execute the operations corresponding to the above modules.
  • the division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division methods in actual implementation.
  • a calibration system including a first camera, a second camera and a controller.
  • the first camera is used to scan the calibration holes on the calibration object and generate a point cloud image, to obtain the scanned point cloud image of the first camera;
  • the second camera is used to scan the calibration holes on the calibration object and generate a point cloud image, to obtain the scanned point cloud image of the second camera;
  • the controller is connected to the first camera and the second camera, and includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
  • the calibration holes scanned by the first camera and the second camera are different.
  • since the above calibration system includes a controller capable of implementing the steps of the above method embodiments, it similarly does not need to rely on a common field of view, and can calibrate the two cameras simply and efficiently.
  • the above-mentioned calibration system further includes a calibration object; at least three non-collinear calibration holes are provided on the opposite surfaces of the calibration object, and the positions of the calibration holes on the opposite surfaces correspond to each other.
  • the above calibration system also includes a calibration object; two rows of calibration holes are provided in the calibration surface of the calibration object, the calibration holes of the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes of a row lie is perpendicular to the line connecting two position-corresponding calibration holes of the two rows.
  • Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory or optical memory, etc.
  • Volatile memory can include Random Access Memory (RAM) or external cache memory.
  • RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application relates to a 3D camera calibration method and apparatus, and a calibration system. The method comprises: acquiring a scanning point cloud image of a first camera and a scanning point cloud image of a second camera to be calibrated, the scanning point cloud image being a point cloud image generated by scanning a calibration hole on a calibration object; extracting hole center coordinates of a calibration hole in the scanning point cloud image of the first camera to obtain first coordinates, and extracting hole center coordinates of a calibration hole in the scanning point cloud image of the second camera to obtain second coordinates; deriving a normal vector of a plane in which the calibration hole scanned by the first camera is located; calculating extension coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinates, the normal vector, and preset calibration hole distance data; and computing coordinate conversion relationship data of the first camera and the second camera on the basis of the second coordinates and the extension coordinates. Using the present application, calibration of two cameras can be easily and efficiently accomplished without relying on a common field of view.

Description

3D camera calibration method, device and calibration system

Technical Field

The present application relates to the technical field of machine vision, and in particular to a 3D camera calibration method, device and calibration system.
Background

With the development of technology, 3D cameras capable of capturing stereoscopic images have appeared. On some occasions, two 3D cameras are used to capture images that are then processed together; in this case the cameras need to be calibrated to obtain the coordinate transformation relationship between the two cameras.

Traditional calibration methods usually require a common field of view, and the camera calibration is performed based on that common field of view. Calibrating two cameras without a common field of view usually requires other electronic tools, such as a laser rangefinder, and the operation is complicated.
Summary

Based on this, in view of the above technical problems, it is necessary to provide a simple and efficient 3D camera calibration method, device and calibration system that do not require a common field of view.
A 3D camera calibration method includes:

acquiring scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;

extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;

obtaining the normal vector of the plane in which the calibration holes scanned by the first camera are located;

calculating, according to the first coordinates, the normal vector and preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera;

calculating, according to the second coordinates and the extended coordinates, coordinate transformation relationship data between the first camera and the second camera.
In one embodiment, the extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates includes:

extracting edge data points of the calibration holes in the scanned point cloud image of the first camera;

extracting the hole-center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
In one embodiment, the obtaining the normal vector of the plane in which the calibration holes scanned by the first camera are located includes:

selecting at least 3 non-collinear points on the surface of the calibration object on which the calibration holes scanned by the first camera are located;

determining a plane from the selected points;

obtaining the normal vector of the plane.
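As a concrete illustration of the three steps above, a plane normal can be obtained from three non-collinear points with a cross product. The following sketch is not taken from the application; the function name and tolerance are illustrative choices:

```python
import numpy as np

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear 3D points.

    The vectors (p1 - p0) and (p2 - p0) both lie in the plane, so their
    cross product is perpendicular to the plane.
    """
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    n = np.cross(p1 - p0, p2 - p0)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        raise ValueError("points are collinear and do not define a plane")
    return n / norm
```

With more than three points available on the surface, a least-squares plane fit (e.g. via SVD of the centered points) is the usual, more noise-robust alternative.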
In one embodiment, at least 3 non-collinear calibration holes are respectively provided in two opposite surfaces of the calibration object, and the positions of the calibration holes on the opposite surfaces correspond to each other; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface of the calibration object.
In one embodiment, the calibration hole distance data includes a calibration object width, the calibration object width being the width between the two opposite surfaces of the calibration object in which the calibration holes are provided; the calculating, according to the first coordinates, the normal vector and the preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera includes:

calculating, according to the first coordinates, the normal vector and the calibration object width, the extended coordinates in the first camera of the calibration holes scanned by the second camera.
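For the opposed-surface arrangement, this calculation amounts to shifting each measured hole center through the object along the surface normal by the object width. A minimal sketch, assuming the normal is oriented from the first camera's surface toward the second camera's surface (the sign must be chosen to match the actual setup):

```python
import numpy as np

def extend_through_object(first_coords, normal, width):
    """Shift hole centers along the unit surface normal by the object width.

    Because the holes on the two opposite surfaces are position-aligned,
    each shifted point estimates where the corresponding hole scanned by
    the second camera lies, expressed in the first camera's coordinates.
    """
    pts = np.asarray(first_coords, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return pts + width * n
```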
In one embodiment, two rows of calibration holes are provided in a calibration surface of the calibration object, the calibration holes of the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes of a row lie is perpendicular to the line connecting two position-corresponding calibration holes of the two rows;

the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface.
In one embodiment, the calibration hole distance data includes the hole spacing between two position-corresponding calibration holes of the two rows; the calculating, according to the first coordinates, the normal vector and the preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera includes:

fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the line on which the calibration holes scanned by the first camera lie;

calculating, according to the first coordinates, the normal vector, the line equation and the hole spacing, the extended coordinates in the first camera of the calibration holes scanned by the second camera.
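For the two-row arrangement, one way to realize the fitting and extension steps is to take the row direction from a least-squares line fit, form the in-plane direction perpendicular to it by crossing the plane normal with it, and shift by the hole spacing. This is only an illustrative sketch; in particular, the sign of the perpendicular direction (toward which of the two rows to shift) must be fixed from the actual geometry:

```python
import numpy as np

def extend_across_rows(first_coords, normal, spacing):
    """Estimate the second row of hole centers from the first row.

    The fitted line direction is the principal direction of the centered
    hole centers; cross(normal, row_dir) is perpendicular to the row but
    still lies in the calibration surface, so stepping along it by the
    known hole spacing lands on the corresponding holes of the other row.
    """
    pts = np.asarray(first_coords, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # least-squares line fit
    row_dir = vt[0]
    n = np.asarray(normal, dtype=float)
    step = np.cross(n / np.linalg.norm(n), row_dir)
    step = step / np.linalg.norm(step)
    return pts + spacing * step
```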
A 3D camera calibration device includes:

an image acquisition module, configured to acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;

a coordinate extraction module, configured to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;

a normal vector acquisition module, configured to obtain the normal vector of the plane in which the calibration holes scanned by the first camera are located;

a coordinate mapping module, configured to calculate, according to the first coordinates, the normal vector and preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera;

a relationship acquisition module, configured to calculate, according to the second coordinates and the extended coordinates, coordinate transformation relationship data between the first camera and the second camera.
A calibration system includes:

a first camera, configured to scan calibration holes on a calibration object and generate a point cloud image, to obtain the scanned point cloud image of the first camera;

a second camera, configured to scan calibration holes on the calibration object and generate a point cloud image, to obtain the scanned point cloud image of the second camera;

a controller connected to the first camera and the second camera and including a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:

acquiring the scanned point cloud images of the first camera and the second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning the calibration holes on the calibration object;

extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;

obtaining the normal vector of the plane in which the calibration holes scanned by the first camera are located;

calculating, according to the first coordinates, the normal vector and preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera;

calculating, according to the second coordinates and the extended coordinates, coordinate transformation relationship data between the first camera and the second camera.
In one embodiment, the above calibration system further includes the calibration object;

at least 3 non-collinear calibration holes are respectively provided in two opposite surfaces of the calibration object, and the positions of the calibration holes on the opposite surfaces correspond to each other; or two rows of calibration holes are provided in a calibration surface of the calibration object, the calibration holes of the same row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the line on which the calibration holes of a row lie is perpendicular to the line connecting two position-corresponding calibration holes of the two rows.
In the above 3D camera calibration method, device and calibration system, the first coordinates in the first camera of the calibration holes scanned by the first camera are extracted from the point cloud image generated by the first camera scanning the calibration holes on the calibration object, and the second coordinates in the second camera of the calibration holes scanned by the second camera are extracted from the point cloud image generated by the second camera scanning the calibration holes on the calibration object; the extended coordinates in the first camera of the calibration holes scanned by the second camera are then calculated from the first coordinates, the normal vector of the plane in which the calibration holes scanned by the first camera are located, and the calibration hole distance data; finally, a correspondence calculation is performed based on the second coordinates and the extended coordinates to obtain the transformation relationship data between the two coordinate systems of the first camera and the second camera. In this way, without relying on a common field of view, the method can be applied to the calibration of two cameras that have no common field of view, and the two cameras can be calibrated simply and efficiently with only one calibration object.
Brief Description of the Drawings

In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flow chart of a 3D camera calibration method in one embodiment;

Fig. 2 is a schematic diagram of two cameras scanning toward each other in one embodiment;

Fig. 3 is a schematic diagram of two points on opposite sides of the calibration object in one embodiment;

Fig. 4 is a schematic diagram of the calibration process for two cameras scanning toward each other;

Fig. 5 is a schematic diagram of two cameras scanning for stitching in one embodiment;

Fig. 6 is a schematic diagram of the calibration process for two cameras scanning for stitching;

Fig. 7 is a structural block diagram of a 3D camera calibration device in one embodiment.
Detailed Description

In order to facilitate understanding of the present application, the present application is described more fully below with reference to the relevant drawings, in which embodiments of the application are given. However, the present application can be embodied in many different forms and is not limited to the embodiments described herein; rather, these embodiments are provided so that the disclosure of this application will be thorough and complete.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the technical field to which this application belongs. The terms used herein in the specification of the application are only for the purpose of describing specific embodiments and are not intended to limit the application.

It can be understood that the terms "first", "second" and the like used in this application may be used herein to describe various elements, but these elements are not limited by these terms; these terms are only used to distinguish one element from another.

It should be noted that when an element is considered to be "connected" to another element, it may be directly connected to the other element or connected to the other element through an intervening element. In addition, "connection" in the following embodiments should be understood as "electrical connection", "communication connection" and the like if electrical signals or data are transmitted between the connected objects.

As used herein, the singular forms "a", "an" and "the" may also include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the terms "comprising", "including" or "having" specify the presence of stated features, integers, steps, operations, components, parts or combinations thereof, but do not exclude the presence or addition of one or more other features, integers, steps, operations, components, parts or combinations thereof.
In one embodiment, as shown in Fig. 1, a 3D camera calibration method is provided, which can be applied to a device used for calibration processing, such as a controller. Taking application to a controller as an example, the method includes the following steps:

S110: acquire scanned point cloud images of the first camera and the second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning the calibration holes on the calibration object.

The first camera and the second camera are the 3D cameras that need to be calibrated. The calibration object is the object used when calibrating the first camera and the second camera, such as a wooden block; holes are formed in the calibration object as calibration holes, and there are multiple calibration holes. Specifically, the first camera scans one part of the calibration holes on the calibration object and the second camera scans another part of the calibration holes, that is, the calibration holes scanned by the first camera and the second camera are different, and the first camera and the second camera have no common field of view. The calibration holes may be round holes or holes of other shapes. Before the calibration process, the first camera and the second camera each scan the calibration holes on the calibration object and generate their respective scanned point cloud images; the controller acquires the scanned point cloud image of the first camera and the scanned point cloud image of the second camera.
S130: extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates, and extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain the second coordinates.

The hole-center coordinates of a calibration hole are the coordinates of the center of the calibration hole; for example, taking a round calibration hole as an example, the hole-center coordinates of the calibration hole are the coordinates of the circle center. The first coordinates are the coordinate values, in the coordinate system of the first camera, of the calibration holes scanned by the first camera, and the second coordinates are the coordinate values, in the coordinate system of the second camera, of the calibration holes scanned by the second camera. Specifically, the first coordinates and the second coordinates are three-dimensional coordinate values.

Specifically, the first camera scans multiple calibration holes, and for each calibration hole in the scanned point cloud image of the first camera, the hole-center coordinates are obtained as the corresponding first coordinates; that is, multiple first coordinates can be obtained from the scanned point cloud image of the first camera. Similarly, multiple second coordinates can be obtained from the scanned point cloud image of the second camera.
S150: obtain the normal vector of the plane in which the calibration holes scanned by the first camera are located.

The calibration object has multiple surfaces, and the calibration holes scanned by the same camera are located on the same surface of the calibration object; for example, the calibration holes scanned by the first camera are all located on the same surface of the calibration object.
S170: calculate, according to the first coordinates, the normal vector and the preset calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera.

The calibration hole distance data represents the distance between the calibration holes scanned by the first camera and the calibration holes scanned by the second camera. The extended coordinates are the coordinate values, in the coordinate system of the first camera, of the positions of the calibration holes scanned by the second camera. According to the first coordinates corresponding to the first camera, the normal vector and the calibration hole distance data, the extended coordinates in the first camera of the calibration holes scanned by the second camera are calculated, that is, the coordinate values in the coordinate system of the first camera of the calibration holes scanned by the second camera are obtained by mapping.
S190: calculate, according to the second coordinates and the extended coordinates, the coordinate transformation relationship data between the first camera and the second camera.

The second coordinates are the coordinate values of the calibration holes scanned by the second camera in the coordinate system of the second camera, and the extended coordinates are the coordinate values of the calibration holes scanned by the second camera in the coordinate system of the first camera; thus, the second coordinates and the extended coordinates corresponding to the same calibration hole are coordinate values in different coordinate systems. A correspondence calculation is performed based on the second coordinates and the extended coordinates to obtain the transformation relationship data between the two coordinate systems, thereby obtaining the coordinate system transformation relationship data between the first camera and the second camera.

The transformation relationship data is data representing the transformation relationship between the coordinates of the first camera and the coordinates of the second camera, and may specifically be a matrix. For example, taking the second camera scanning three or more calibration holes as an example, the second coordinates and the extended coordinates are three-dimensional coordinate values; the second coordinates of each calibration hole correspond to one row of a matrix, so an original matrix is obtained from the second coordinates of the multiple calibration holes; the extended coordinates of each calibration hole correspond to one row of another matrix, so an extended matrix is obtained from the extended coordinates of the multiple calibration holes; a mathematical algorithm is then used to calculate the data needed to transform the original matrix into the extended matrix, or the extended matrix into the original matrix, to obtain the transformation matrix.
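The application leaves the "mathematical algorithm" unspecified. One standard choice for recovering a rigid transformation between two sets of corresponding 3D points (here the extended coordinates and the second coordinates) is the SVD-based Kabsch method, sketched below as an assumption rather than a description of the actual implementation:

```python
import numpy as np

def rigid_transform(src, dst):
    """Best-fit rotation R and translation t with R @ src_i + t ≈ dst_i.

    Classic SVD-based (Kabsch) solution: center both point sets, take the
    SVD of the cross-covariance matrix, and correct for a possible
    reflection so that R is a proper rotation (det R = +1).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # reflection check
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```

At least three non-collinear correspondences are needed for a unique solution, which matches the "three or more calibration holes" condition above.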
In the above 3D camera calibration method, the first coordinates in the first camera of the calibration holes scanned by the first camera are extracted from the point cloud image generated by the first camera scanning the calibration holes on the calibration object, and the second coordinates in the second camera of the calibration holes scanned by the second camera are extracted from the point cloud image generated by the second camera scanning the calibration holes on the calibration object; the extended coordinates in the first camera of the calibration holes scanned by the second camera are then calculated from the first coordinates, the normal vector of the plane in which the calibration holes scanned by the first camera are located, and the calibration hole distance data; finally, a correspondence calculation is performed based on the second coordinates and the extended coordinates to obtain the transformation relationship data between the two coordinate systems of the first camera and the second camera. In this way, without relying on a common field of view, the method can be applied to the calibration of two cameras that have no common field of view, and the two cameras can be calibrated simply and efficiently with only one calibration object.
In one embodiment, in step S130, extracting the hole-center coordinates of the calibration holes in the first camera's scanned point cloud to obtain the first coordinates includes: extracting edge data points of the calibration holes in the first camera's scanned point cloud; and extracting the hole-center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
Extracting the edge data points of a calibration hole means determining the hole's edge. Determining the edge first and then deriving the center-point coordinates from the edge yields high accuracy. Specifically, a gradient method can be used to extract the edge data points of a calibration hole, accurately locating the edge, and a spatial circle-center fitting algorithm can be used to extract the hole-center coordinates, accurately obtaining the center point. For example, if the calibration holes are circular, the gradient method extracts the edge data points of each circular hole and the spatial circle-center fitting algorithm extracts its center coordinates.
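The "spatial circle-center fitting algorithm" is not detailed in the description. One common approach, shown here as an illustrative sketch (the function name and the algebraic 2D circle fit are assumptions, not the patent's method), projects the roughly coplanar edge points onto their best-fit plane, fits a circle in 2D by linear least squares, and maps the center back to 3D:

```python
import numpy as np

def fit_circle_center_3d(edge_pts):
    """Fit the center of a circle from roughly coplanar 3D edge points.

    Projects the points onto their best-fit plane, solves the algebraic
    least-squares circle fit in 2D, and maps the center back to 3D.
    """
    P = np.asarray(edge_pts, dtype=float)
    centroid = P.mean(axis=0)
    Q = P - centroid
    # Best-fit plane via SVD: the last right-singular vector is the normal,
    # the first two form an orthonormal in-plane basis.
    _, _, Vt = np.linalg.svd(Q)
    e1, e2 = Vt[0], Vt[1]
    u, v = Q @ e1, Q @ e2                    # 2D coordinates in the plane
    # Algebraic circle fit: u^2 + v^2 = 2*a*u + 2*b*v + c
    A = np.column_stack([2 * u, 2 * v, np.ones_like(u)])
    rhs = u ** 2 + v ** 2
    (a, b, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return centroid + a * e1 + b * e2        # circle center back in 3D
```

In practice the edge points would first be obtained by the gradient-based edge extraction described above.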
Specifically, in step S130, the step of extracting the hole-center coordinates of the calibration holes in the second camera's scanned point cloud to obtain the second coordinates is the same as the step of extracting the hole-center coordinates of the calibration holes in the first camera's scanned point cloud to obtain the first coordinates, and is not repeated here.
In one embodiment, step S150 includes: selecting at least three non-collinear points on the surface of the calibration object containing the calibration holes scanned by the first camera; determining a plane from the selected points; and computing the normal vector of that plane.
On the surface containing the calibration holes scanned by the first camera, at least three non-collinear points are selected; three non-collinear points determine a plane, whose normal vector is then computed. Specifically, the selected points lie outside the calibration holes but on the same surface as the holes. Compared with determining the plane directly from the calibration holes, selecting separate points is more accurate, because the hole coordinates are derived from the fitted hole centers and may carry error. For example, if the first camera scans the calibration holes on the left face of the calibration object, three non-collinear points on that face, other than the holes, are selected to determine the plane and obtain its normal vector. Specifically, the normal vector of the plane containing the holes can be extracted with a plane-fitting algorithm.
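The minimal case, three non-collinear points, reduces to a cross product. A short sketch (function name assumed for illustration):

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear 3D points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)           # perpendicular to both edges
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        raise ValueError("points are collinear; they do not define a plane")
    return n / norm
```

With more than three points, a least-squares plane fit (e.g. the SVD of the centered points) is the more robust choice the description alludes to.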
In one embodiment, at least three non-collinear calibration holes are formed on each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding to each other. The first camera's scanned point cloud is generated by the first camera scanning the calibration holes on one surface, and the second camera's scanned point cloud is generated by the second camera scanning the calibration holes on the opposite surface.
The calibration object is a standard part, preferably an ideal cuboid. Taking a cuboid as an example, its opposite sides include left and right, front and back, and top and bottom. For instance, at least three non-collinear calibration holes may be formed on each of the left and right sides, with equal numbers of holes on the two sides in one-to-one positional correspondence; the first camera scans the left side and the second camera scans the right side. This embodiment thus suits the case where the two cameras scan toward each other from opposite sides.
As shown in Fig. 2, the two cameras are placed on the left and right of the calibration object. On the left and right faces of the calibration object, several high-precision circular holes are drilled (at least three, not collinear), with equal numbers of holes on the two faces in one-to-one positional correspondence. Specifically, to guarantee the one-to-one correspondence of hole positions, the holes may be drilled all the way through. The first camera (camera 1) scans the calibration holes on the left plane and the second camera (camera 2) scans the calibration holes on the right plane; in principle, the scanning direction need not be parallel to the surface of the calibration object.
In one embodiment, the calibration-hole distance data includes the calibration object's width, where the width is the distance between the two opposite surfaces of the calibration object in which the calibration holes are formed. Step S170 includes: computing the extended coordinates of the calibration holes scanned by the second camera, expressed in the first camera's coordinate system, from the first coordinates, the normal vector, and the calibration object's width.
For the case where the two cameras scan toward each other, the coordinates of the calibration holes scanned by the second camera, expressed in the first camera's coordinate system, are obtained from the first coordinates in the first camera's coordinate system, the normal vector of the plane scanned by the first camera, and the intrinsic width between the two hole-bearing surfaces of the calibration object. Specifically, the coordinates can be computed geometrically. As shown in Fig. 3, taking the field of view of the first camera, which scans the left side of the calibration object, as the reference frame: the coordinates of point A on the left face are known, the thickness d between the left and right faces is known, and the normal vector is known. The equation of the line through A can be determined from the normal vector and A's coordinates, and the coordinates of point B on the right face, which are the extended coordinates, can then be computed from the line equation, A's coordinates, and the calibration object's width, i.e., the distance between A and B.
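Parameterizing the line through A by the unit normal, the geometry above collapses to a one-line computation, B = A + d·n. A minimal sketch, assuming the normal is oriented from the scanned face into the object (the function name is illustrative):

```python
import numpy as np

def extend_through_object(a, normal, width):
    """Point B on the opposite face: travel `width` from A along the unit
    face normal. Assumes the normal points from A's face into the object.
    """
    a = np.asarray(a, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)                # normalize before scaling
    return a + width * n
```

If the fitted normal points away from the object, the sign of the step must be flipped; in practice this orientation is fixed once per setup.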
Fig. 4 is a schematic diagram of the calibration flow for two cameras scanning toward each other. After the calibration program starts, camera 1 and camera 2 scan the calibration object to generate point clouds. From these point clouds, the three-dimensional positions (center coordinates) of the N hole centers on the left plane in camera 1 and of the corresponding N hole centers on the right plane in camera 2 are fitted. Using the normal vector of the left plane in camera 1 and the intrinsic width of the calibration object, the three-dimensional coordinates of the corresponding N right-plane hole centers in camera 1 are then computed. Finally, with the three-dimensional coordinates of the N right-plane hole centers known in both camera 1 and camera 2, the transformation between the two camera coordinate systems is computed to obtain the transformation matrix, completing the calibration of the two cameras. All coordinate processing in this application is performed in three-dimensional space and is insensitive to factors such as camera position and scanning direction, so the method is more robust and more accurate in practical use.
In one embodiment, two rows of calibration holes are formed on a calibration surface of the calibration object. The holes in each row are collinear, the hole positions in the two rows correspond one to one, and the line containing each row is perpendicular to the line connecting any pair of positionally corresponding holes in the two rows. Specifically, the two rows of holes are opposite each other, equal in number, and in one-to-one correspondence: the first hole of the first row corresponds in position to the first hole of the second row, the second hole of the first row to the second hole of the second row, and so on. Taking two holes per row as an example, the positionally corresponding pairs are the first holes of the two rows and the second holes of the two rows. If the calibration surface is square, the two rows of holes may be placed along its two opposite edges.
Specifically, the first camera's scanned point cloud is generated by the first camera scanning one row of calibration holes on the calibration surface, and the second camera's scanned point cloud is generated by the second camera scanning the other row of calibration holes on the calibration surface.
This embodiment suits the stitched-scanning case, in which the two cameras scan the two side regions of the same surface of an object, and those regions cannot be stitched directly because there is no common field of view. It should be noted that this application is for 3D cameras; the stitching is not only left–right stitching, but may also involve regions at unequal heights.
In this embodiment, one surface of the calibration object serves as the calibration surface, its two sides serve as the regions to be inspected, and a row of calibration holes is made on each side. The hole centers in each row must lie on one straight line, the two rows must contain equal numbers of holes in one-to-one positional correspondence, and the line containing each row must be perpendicular to the lines connecting positionally corresponding holes in the two rows. For example, as shown in Fig. 5, two rows of circular holes are formed on the top surface of the calibration object; the first camera scans the first row, the second camera scans the second row, and the transformation matrix between the two camera coordinate systems is finally computed, after which image stitching or other joint image processing can be performed. Stitching in this embodiment places no requirement on the cameras' fields of view: the images need not contain a common field of view, and each 3D camera's field of view only needs to cover its inspection region. There is no need to choose a large-field-of-view camera just to include a common field of view, which preserves inspection accuracy.
In one embodiment, the calibration-hole distance data includes the hole spacing between positionally corresponding holes in the two rows; for example, the spacing between the first hole of the first row and the first hole of the second row. Step S170 includes: fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the line containing those holes; and computing, from the first coordinates, the normal vector, the line equation, and the hole spacing, the extended coordinates of the calibration holes scanned by the second camera in the first camera's coordinate system.
For the stitched-scanning case, the equation of the line containing the first-camera holes is fitted from the first coordinates, and the coordinates of the calibration holes scanned by the second camera, expressed in the first camera's coordinate system, are obtained from the first coordinates, the normal vector, the line equation, and the spacing between corresponding holes. Specifically, the coordinates can be computed geometrically. In Fig. 5, for example, point C lies in the first row and point D in the second row. The line through D is uniquely determined by the line containing the holes of C's row together with the plane's normal vector, so the equation of the line through C and D can be obtained; from that line equation and C's coordinates, D's coordinates, which are the extended coordinates, are computed. The extended coordinates of the other holes in the second row are computed in the same way. Fig. 6 is a schematic diagram of the stitched-scanning calibration flow in one embodiment.
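The step from C to D can be sketched as follows: the in-plane direction perpendicular to the first row is the cross product of the plane normal and the row direction, and D lies one hole-spacing along it. This is an illustrative reading of the geometry above, not the patent's code; function names, and the sign convention for the cross product, are assumptions.

```python
import numpy as np

def fit_line_direction(pts):
    """Unit direction of the best-fit line through 3D points (via SVD)."""
    P = np.asarray(pts, dtype=float)
    _, _, Vt = np.linalg.svd(P - P.mean(axis=0))
    return Vt[0]                             # dominant direction of spread

def second_row_point(c, row_dir, normal, spacing):
    """Point D in the second row: step `spacing` from C along the in-plane
    direction perpendicular to the first row (normal x row direction).
    The sign of the step depends on the chosen orientation conventions.
    """
    across = np.cross(np.asarray(normal, dtype=float),
                      np.asarray(row_dir, dtype=float))
    across = across / np.linalg.norm(across)
    return np.asarray(c, dtype=float) + spacing * across
```

Repeating `second_row_point` for each first-row hole center yields the extended coordinates of the whole second row.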
It should be understood that, although the steps in the flowcharts of Figs. 1, 4, and 6 are shown in the order indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, their execution is not strictly ordered, and the steps may be executed in other orders. Moreover, at least some of the steps in Figs. 1, 4, and 6 may comprise multiple sub-steps or stages, which need not be completed at the same moment but may be executed at different times; nor need they be executed sequentially, as they may be executed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in Fig. 7, a 3D camera calibration apparatus is provided, including an image acquisition module 710, a coordinate extraction module 730, a normal vector acquisition module 750, a coordinate mapping module 770, and a relation acquisition module 790, wherein:
The image acquisition module 710 is configured to acquire the scanned point clouds of the first and second cameras to be calibrated, the scanned point clouds being generated by scanning the calibration holes on the calibration object. The coordinate extraction module 730 is configured to extract the hole-center coordinates of the calibration holes in the first camera's scanned point cloud to obtain the first coordinates, and to extract the hole-center coordinates of the calibration holes in the second camera's scanned point cloud to obtain the second coordinates. The normal vector acquisition module 750 is configured to compute the normal vector of the plane containing the calibration holes scanned by the first camera. The coordinate mapping module 770 is configured to compute, from the first coordinates, the normal vector, and the preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera's coordinate system. The relation acquisition module 790 is configured to compute, from the second coordinates and the extended coordinates, the coordinate transformation relation data between the first camera and the second camera.
In the above 3D camera calibration apparatus, the first coordinates of the calibration holes scanned by the first camera, expressed in the first camera's coordinate system, are extracted from the point cloud generated by the first camera scanning the calibration holes on the calibration object, and the second coordinates of the calibration holes scanned by the second camera, expressed in the second camera's coordinate system, are extracted from the point cloud generated by the second camera scanning the calibration holes on the calibration object. The extended coordinates of the calibration holes scanned by the second camera, expressed in the first camera's coordinate system, are then computed from the first coordinates, the normal vector of the plane containing the holes scanned by the first camera, and the calibration-hole distance data. Finally, the correspondence between the second coordinates and the extended coordinates is computed to obtain the transformation relation data between the two camera coordinate systems. The apparatus therefore does not depend on a common field of view and can be applied to calibrating two cameras that share none; with only a single calibration object, the two cameras can be calibrated simply and efficiently.
In one embodiment, the coordinate extraction module 730 is configured to extract the edge data points of the calibration holes in the first camera's scanned point cloud; extract the hole-center coordinates of the calibration holes from those edge data points to obtain the first coordinates; extract the edge data points of the calibration holes in the second camera's scanned point cloud; and extract the hole-center coordinates of the calibration holes from those edge data points to obtain the second coordinates.
In one embodiment, the normal vector acquisition module 750 selects at least three non-collinear points on the surface of the calibration object containing the calibration holes scanned by the first camera, determines a plane from the selected points, and computes the plane's normal vector.
In one embodiment, at least three non-collinear calibration holes are formed on each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; the first camera's scanned point cloud is generated by the first camera scanning the calibration holes on one surface, and the second camera's scanned point cloud is generated by the second camera scanning the calibration holes on the other surface.
Correspondingly, the calibration-hole distance data includes the calibration object's width, where the width is the distance between the two opposite hole-bearing surfaces of the calibration object. The coordinate mapping module 770 computes the extended coordinates of the calibration holes scanned by the second camera in the first camera's coordinate system from the first coordinates, the normal vector, and the calibration object's width.
In one embodiment, two rows of calibration holes are formed on a calibration surface of the calibration object; the holes in each row are collinear, the hole positions in the two rows correspond one to one, and the line containing each row is perpendicular to the lines connecting positionally corresponding holes in the two rows. The first camera's scanned point cloud is generated by the first camera scanning one row of calibration holes on the calibration surface, and the second camera's scanned point cloud is generated by the second camera scanning the other row of calibration holes on the calibration surface.
Correspondingly, the calibration-hole distance data includes the hole spacing between positionally corresponding holes in the two rows. The coordinate mapping module 770 fits, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the line containing those holes, and computes, from the first coordinates, the normal vector, the line equation, and the hole spacing, the extended coordinates of the calibration holes scanned by the second camera in the first camera's coordinate system.
For specific limitations of the 3D camera calibration apparatus, see the limitations of the 3D camera calibration method above, which are not repeated here. The modules of the above 3D camera calibration apparatus may be implemented wholly or partly in software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in the controller in hardware form, or stored in a memory of the controller in software form so that the processor can invoke and execute the operations corresponding to each module. It should be noted that the division into modules in the embodiments of this application is illustrative and represents only a logical division of functions; other divisions are possible in actual implementation.
In one embodiment, a calibration system is provided, including a first camera, a second camera, and a controller. The first camera is configured to scan the calibration holes on the calibration object and generate a point cloud to obtain the first camera's scanned point cloud; the second camera is configured to scan the calibration holes on the calibration object and generate a point cloud to obtain the second camera's scanned point cloud. The controller, connected to the first camera and the second camera, includes a memory and a processor; the memory stores a computer program, and the processor, when executing the computer program, implements the steps of the method embodiments above. Specifically, the calibration holes scanned by the first camera and the second camera are different.
Since the above calibration system includes a controller that can implement the steps of the method embodiments above, it likewise does not need to rely on a common field of view and can calibrate the two cameras simply and efficiently.
In one embodiment, the calibration system further includes the calibration object; at least three non-collinear calibration holes are formed on each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding.
In one embodiment, the calibration system further includes the calibration object; two rows of calibration holes are provided on a calibration surface of the calibration object, the holes in each row are collinear, the hole positions in the two rows correspond one to one, and the line containing each row is perpendicular to the lines connecting positionally corresponding holes in the two rows.
For specific limitations of the computer program executed by the processor of the controller in the calibration system, see the limitations of the 3D camera calibration method above, which are not repeated here.
A person of ordinary skill in the art will understand that all or part of the processes in the method embodiments above can be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments above. Any reference to memory, storage, a database, or other media used in the embodiments provided in this application may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, or optical memory. Volatile memory may include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM may take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
In the description of this specification, references to terms such as "some embodiments", "other embodiments", and "ideal embodiments" mean that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic descriptions of these terms do not necessarily refer to the same embodiment or example.
The technical features of the above embodiments may be combined arbitrarily. For conciseness, not all possible combinations of the technical features in the above embodiments are described; nevertheless, any combination of these technical features that contains no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of this application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention patent. It should be noted that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of this application, all of which fall within the protection scope of this application. Therefore, the protection scope of this patent application shall be determined by the appended claims.

Claims (10)

  1. A 3D camera calibration method, comprising:
    acquiring scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes in a calibration object;
    extracting hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
    determining a normal vector of the plane in which the calibration holes scanned by the first camera lie;
    calculating, according to the first coordinates, the normal vector, and preset calibration hole distance data, extension coordinates, in the first camera, of the calibration holes scanned by the second camera; and
    calculating coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
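As a non-claimed illustrative sketch, the final step of claim 1 — computing the coordinate transformation relationship between the two cameras from the second coordinates and the extension coordinates — can be realized with the well-known SVD-based (Kabsch) rigid alignment of two corresponding point sets. The function name, the use of NumPy, and the specific formulation are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3-D points, e.g. the
    extension coordinates and the second camera's hole-center
    coordinates. Standard SVD-based (Kabsch) alignment.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)        # centroids
    H = (src - cs).T @ (dst - cd)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Given the returned pair, a point p in the first camera's frame maps to `R @ p + t` in the second camera's frame, which is one concrete form the claimed "coordinate transformation relationship data" could take.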
  2. The method according to claim 1, wherein extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates comprises:
    extracting edge data points of the calibration holes in the scanned point cloud image of the first camera; and
    extracting the hole-center coordinates of the calibration holes according to the edge data points to obtain the first coordinates.
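As a crude, non-claimed sketch of claim 2's second step: when edge data points are sampled roughly uniformly around a complete hole rim, their centroid coincides with the hole center. The function below is an illustrative assumption only; a least-squares circle fit would be more robust for partial or unevenly sampled rims:

```python
import numpy as np

def hole_center_from_edges(edge_points):
    """Estimate a hole center as the centroid of its rim edge points.

    Valid for edge points covering the full rim approximately
    uniformly; a circle fit is preferable for partial rims.
    """
    return np.asarray(edge_points, dtype=float).mean(axis=0)
```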
  3. The method according to claim 1, wherein determining the normal vector of the plane in which the calibration holes scanned by the first camera lie comprises:
    selecting at least three non-collinear points on the surface of the calibration object in which the calibration holes scanned by the first camera are located;
    determining a plane according to the selected points; and
    determining the normal vector of the plane.
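The three steps of claim 3 can be sketched with the standard cross-product construction — the normal of the plane through three non-collinear points is the cross product of two in-plane direction vectors. The function name is an assumption for illustration:

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # perpendicular to both in-plane vectors
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        raise ValueError("points are (nearly) collinear")
    return n / norm
```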
  4. The method according to any one of claims 1-3, wherein at least three non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, and the positions of the calibration holes in the two opposite surfaces correspond to each other; the scanned point cloud image of the first camera is a point cloud image generated by the first camera scanning the calibration holes in one surface of the calibration object, and the scanned point cloud image of the second camera is a point cloud image generated by the second camera scanning the calibration holes in the other surface of the calibration object.
  5. The method according to claim 4, wherein the calibration hole distance data comprises a calibration object width, the calibration object width being the width between the two opposite surfaces of the calibration object in which the calibration holes are provided; and calculating, according to the first coordinates, the normal vector, and the preset calibration hole distance data, the extension coordinates, in the first camera, of the calibration holes scanned by the second camera comprises:
    calculating, according to the first coordinates, the normal vector, and the calibration object width, the extension coordinates, in the first camera, of the calibration holes scanned by the second camera.
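A non-claimed sketch of the calculation in claim 5: each hole center on the first surface is pushed through the calibration object along the plane normal by the object width. The function name is an assumption, and the assumed sign convention is that the unit normal points from the first camera's surface toward the opposite surface; in practice the sign would be fixed by the rig geometry:

```python
import numpy as np

def extend_through_object(first_coords, n, width):
    """Offset hole centers along the plane normal by the object width.

    first_coords: (N, 3) hole-center coordinates on the first surface.
    n: plane normal (assumed to point toward the opposite surface).
    width: distance between the two opposite surfaces.
    """
    first_coords = np.asarray(first_coords, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)           # ensure unit length
    return first_coords + width * n     # broadcasts over all N points
```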
  6. The method according to any one of claims 1-3, wherein two rows of calibration holes are provided in a calibration surface of the calibration object, the calibration holes in each row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the straight line on which each row of calibration holes lies is perpendicular to the connecting line between two position-corresponding calibration holes of the two rows;
    the scanned point cloud image of the first camera is a point cloud image generated by the first camera scanning one row of calibration holes in the calibration surface, and the scanned point cloud image of the second camera is a point cloud image generated by the second camera scanning the other row of calibration holes in the calibration surface.
  7. The method according to claim 6, wherein the calibration hole distance data comprises a hole spacing between two position-corresponding calibration holes of the two rows of calibration holes; and calculating, according to the first coordinates, the normal vector, and the preset calibration hole distance data, the extension coordinates, in the first camera, of the calibration holes scanned by the second camera comprises:
    fitting, according to the first coordinates of the plurality of calibration holes scanned by the first camera, a line equation of the straight line on which the calibration holes scanned by the first camera lie; and
    calculating, according to the first coordinates, the normal vector, the line equation, and the hole spacing, the extension coordinates, in the first camera, of the calibration holes scanned by the second camera.
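The two steps of claim 7 can be sketched as: fit the line through the first row of hole centers (here via the principal direction of the centered points), then offset each center by the hole spacing along the in-plane direction perpendicular to that line. The function name is an assumption, and the offset sign is left ambiguous here; in practice it would be fixed by the known rig geometry:

```python
import numpy as np

def row_to_row_extension(first_coords, n, spacing):
    """Offset a row of hole centers to the corresponding second row.

    first_coords: (N, 3) centers of the first row of holes.
    n: unit normal of the calibration surface (as in claim 3).
    spacing: distance between position-corresponding holes of the rows.
    """
    P = np.asarray(first_coords, dtype=float)
    c = P.mean(axis=0)
    # Principal direction of the centered points = fitted line direction
    _, _, Vt = np.linalg.svd(P - c)
    d = Vt[0]
    # In-plane direction perpendicular to the fitted line
    u = np.cross(np.asarray(n, dtype=float), d)
    u /= np.linalg.norm(u)
    return P + spacing * u   # sign of u is an assumption; fixed by geometry
```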
  8. A 3D camera calibration apparatus, comprising:
    an image acquisition module configured to acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes in a calibration object;
    a coordinate extraction module configured to extract hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
    a normal vector acquisition module configured to determine a normal vector of the plane in which the calibration holes scanned by the first camera lie;
    a coordinate mapping module configured to calculate, according to the first coordinates, the normal vector, and preset calibration hole distance data, extension coordinates, in the first camera, of the calibration holes scanned by the second camera; and
    a relationship acquisition module configured to calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
  9. A calibration system, comprising:
    a first camera configured to scan calibration holes in a calibration object and generate a point cloud image to obtain the scanned point cloud image of the first camera;
    a second camera configured to scan the calibration holes in the calibration object and generate a point cloud image to obtain the scanned point cloud image of the second camera; and
    a controller connected to the first camera and the second camera and comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1-7.
  10. The calibration system according to claim 9, further comprising the calibration object;
    wherein at least three non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, and the positions of the calibration holes in the two opposite surfaces correspond to each other; or two rows of calibration holes are provided in a calibration surface of the calibration object, the calibration holes in each row lie on a straight line, the positions of the calibration holes in the two rows correspond one to one, and the straight line on which each row of calibration holes lies is perpendicular to the connecting line between two position-corresponding calibration holes of the two rows.
PCT/CN2022/087716 2021-07-15 2022-04-19 3d camera calibration method and apparatus, and calibration system WO2023284349A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110802360.3 2021-07-15
CN202110802360.3A CN113436277A (en) 2021-07-15 2021-07-15 3D camera calibration method, device and system

Publications (1)

Publication Number Publication Date
WO2023284349A1

Family

ID=77760670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087716 WO2023284349A1 (en) 2021-07-15 2022-04-19 3d camera calibration method and apparatus, and calibration system

Country Status (2)

Country Link
CN (1) CN113436277A (en)
WO (1) WO2023284349A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117723551A (en) * 2024-02-18 2024-03-19 宁德时代新能源科技股份有限公司 Battery detection device, point detection method, battery production device and detection method

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN113436277A (en) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3D camera calibration method, device and system

Citations (9)

Publication number Priority date Publication date Assignee Title
CN111476846A (en) * 2020-04-01 2020-07-31 苏州苏映视图像软件科技有限公司 Multi-3D camera calibration system and method
CN112180362A (en) * 2019-07-05 2021-01-05 北京地平线机器人技术研发有限公司 Conversion pose determination method and device between radar and camera and electronic equipment
CN112308926A (en) * 2020-10-16 2021-02-02 易思维(杭州)科技有限公司 Camera external reference calibration method without public view field
CN112612016A (en) * 2020-12-07 2021-04-06 深兰人工智能(深圳)有限公司 Sensor combined calibration method and device, electronic equipment and storage medium
US20210118182A1 (en) * 2020-12-22 2021-04-22 Intel Corporation Methods and apparatus to perform multiple-camera calibration
CN112785656A (en) * 2021-01-29 2021-05-11 北京罗克维尔斯科技有限公司 Calibration method and device for double stereo cameras, electronic equipment and storage medium
CN112802124A (en) * 2021-01-29 2021-05-14 北京罗克维尔斯科技有限公司 Calibration method and device for multiple stereo cameras, electronic equipment and storage medium
CN113077521A (en) * 2021-03-19 2021-07-06 浙江华睿科技有限公司 Camera calibration method and device
CN113436277A (en) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3D camera calibration method, device and system

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN105678742B (en) * 2015-12-29 2018-05-22 哈尔滨工业大学深圳研究生院 A kind of underwater camera scaling method
CN109357633B (en) * 2018-09-30 2022-09-30 先临三维科技股份有限公司 Three-dimensional scanning method, device, storage medium and processor
CN111047510B (en) * 2019-12-17 2023-02-14 大连理工大学 Large-field-angle image real-time splicing method based on calibration
CN111965624B (en) * 2020-08-06 2024-04-09 阿波罗智联(北京)科技有限公司 Laser radar and camera calibration method, device, equipment and readable storage medium


Non-Patent Citations (1)

Title
SONG DAIPING;LU LU: "Non-cooperative Circle Characteristic Pose Measurement Using Multiple Cameras without Public Field of View", INFRARED TECHNOLOGY, vol. 42, no. 1, 21 January 2020 (2020-01-21), pages 93 - 98, XP093023409, ISSN: 1001-8891 *


Also Published As

Publication number Publication date
CN113436277A (en) 2021-09-24

Similar Documents

Publication Publication Date Title
WO2023284349A1 (en) 3d camera calibration method and apparatus, and calibration system
CN106548489B (en) A kind of method for registering, the three-dimensional image acquisition apparatus of depth image and color image
Orghidan et al. Camera calibration using two or three vanishing points
US20220092819A1 (en) Method and system for calibrating extrinsic parameters between depth camera and visible light camera
CN112562014B (en) Camera calibration method, system, medium and device
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN108109169B (en) Pose estimation method and device based on rectangular identifier and robot
CN109906471B (en) Real-time three-dimensional camera calibration
US20210104066A1 (en) Computer-implemented methods and system for localizing an object
US20170270654A1 (en) Camera calibration using depth data
WO2023201578A1 (en) Extrinsic parameter calibration method and device for monocular laser speckle projection system
US10252417B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
Briales et al. A minimal solution for the calibration of a 2D laser-rangefinder and a camera based on scene corners
Song et al. Modeling deviations of rgb-d cameras for accurate depth map and color image registration
CN111915681B (en) External parameter calibration method, device, storage medium and equipment for multi-group 3D camera group
CN113822920B (en) Method for acquiring depth information by structured light camera, electronic equipment and storage medium
Real-Moreno et al. Camera calibration method through multivariate quadratic regression for depth estimation on a stereo vision system
Zhang et al. Camera self-calibration based on multiple view images
Cui et al. ACLC: Automatic Calibration for non-repetitive scanning LiDAR-Camera system based on point cloud noise optimization
Guillemaut et al. Using points at infinity for parameter decoupling in camera calibration
KR20200057929A (en) Method for rectification of stereo images captured by calibrated cameras and computer program
CN114648544A (en) Sub-pixel ellipse extraction method
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium
CN113052918A (en) Method, device, medium and equipment for evaluating calibration error of antipodal binocular camera
CN113793379A (en) Camera pose solving method, system, equipment and computer readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22840993

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22840993

Country of ref document: EP

Kind code of ref document: A1