WO2023284349A1 - 3D camera calibration method, device and calibration system - Google Patents

3D camera calibration method, device and calibration system

Info

Publication number
WO2023284349A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
camera
coordinates
point cloud
scanned
Prior art date
Application number
PCT/CN2022/087716
Other languages
English (en)
French (fr)
Inventor
胡显东
缪丰
贾仕勇
Original Assignee
无锡先导智能装备股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 无锡先导智能装备股份有限公司
Publication of WO2023284349A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration
    • G06T17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 - Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30244 - Camera pose

Definitions

  • The present application relates to the technical field of machine vision, and in particular to a 3D camera calibration method, device, and calibration system.
  • 3D cameras that can capture stereoscopic images have emerged.
  • In some applications, two 3D cameras are used together, and image processing is performed after shooting.
  • Before use, the cameras need to be calibrated to obtain the coordinate transformation relationship between the two cameras.
  • In conventional methods, a common field of view is usually required, and camera calibration is performed based on that common field of view.
  • Otherwise, additional electronic tools, such as a laser rangefinder, are usually required for calibration, and the operation is complicated.
  • In one aspect, a 3D camera calibration method is provided, comprising:
  • obtaining scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being generated by scanning calibration holes on a calibration object; and
  • calculating coordinate transformation relationship data between the first camera and the second camera.
  • In some embodiments, the first coordinates are obtained by extracting the hole center coordinates of the calibration holes in the scanned point cloud image of the first camera.
  • In some embodiments, the normal vector of the plane on which the calibration holes scanned by the first camera are located is obtained.
  • In some embodiments, at least three non-collinear calibration holes are provided on each of two opposite surfaces of the calibration object, and the positions of the calibration holes on the two opposite surfaces correspond to each other;
  • the scanned point cloud image of the first camera is generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is generated by the second camera scanning the calibration holes on the other surface.
  • In these embodiments, the calibration hole distance data includes the calibration object width, i.e. the width between the two opposite surfaces of the calibration object on which the calibration holes are provided; and
  • the extension coordinates of the calibration holes scanned by the second camera in the first camera are calculated from the first coordinates, the normal vector, and the calibration object width.
  • In other embodiments, two rows of calibration holes are provided on a calibration surface of the calibration object; the calibration holes of each row lie on a straight line, the holes of the two rows correspond one to one in position, and
  • the line on which each row lies is perpendicular to the line connecting each pair of corresponding calibration holes;
  • the scanned point cloud image of the first camera is generated by the first camera scanning one row of calibration holes on the calibration surface, and
  • the scanned point cloud image of the second camera is generated by the second camera scanning the other row of calibration holes.
  • In these embodiments, the calibration hole distance data includes the hole spacing between two corresponding calibration holes in the two rows; calculating the extension coordinates then includes: fitting, from the first coordinates, the equation of the straight line on which the calibration holes scanned by the first camera lie; and
  • calculating the extension coordinates of the calibration holes scanned by the second camera in the first camera from the first coordinates, the normal vector, the straight-line equation, and the hole spacing.
  • In another aspect, a 3D camera calibration device is provided, comprising:
  • an image acquisition module, configured to obtain scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being generated by scanning calibration holes on a calibration object;
  • a coordinate extraction module, configured to extract the hole center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the hole center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
  • a normal vector acquisition module, configured to obtain the normal vector of the plane on which the calibration holes scanned by the first camera are located;
  • a coordinate mapping module, configured to calculate the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and preset calibration hole distance data; and
  • a relationship acquisition module, configured to calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
  • a calibration system comprising:
  • the first camera is used to scan the calibration hole on the calibration object to generate a point cloud image to obtain the scanned point cloud image of the first camera;
  • the second camera is used to scan the calibration hole on the calibration object to generate a point cloud image to obtain the scanned point cloud image of the second camera;
  • a controller connected to the first camera and the second camera, including a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the following steps:
  • obtaining the scanned point cloud images of the first camera and the second camera to be calibrated, the scanned point cloud images being generated by scanning the calibration holes on the calibration object; and
  • calculating coordinate transformation relationship data between the first camera and the second camera.
  • In some embodiments, the above calibration system further includes the calibration object.
  • At least three non-collinear calibration holes are provided on each of two opposite surfaces of the calibration object, with corresponding positions; or two rows of calibration holes are provided on a calibration surface of the calibration object, where the calibration holes in each row lie on a straight line, the holes in the two rows correspond one to one in position, and the line on which each row lies is perpendicular to the line connecting each pair of corresponding holes.
  • The above 3D camera calibration method, device, and calibration system extract the first coordinates of the calibration holes scanned by the first camera from the point cloud image generated by the first camera scanning the calibration holes on the calibration object, and
  • extract the second coordinates of the calibration holes scanned by the second camera from the point cloud image generated by the second camera; the extension coordinates of the holes scanned by the second camera in the first camera are then calculated from the first coordinates, the normal vector of the plane on which the first camera's holes lie, and the preset calibration hole distance data; and
  • finally, the coordinate transformation relationship data between the two camera coordinate systems is calculated from the second coordinates and the extension coordinates. The approach therefore does not rely on a common field of view and applies to calibrating two cameras that have none: a single calibration object suffices to calibrate both cameras simply and efficiently.
  • Fig. 1 is a schematic flowchart of a 3D camera calibration method in one embodiment;
  • Fig. 2 is a schematic diagram of two cameras scanning toward each other in one embodiment;
  • Fig. 3 is a schematic diagram of two points on opposite surfaces of the calibration object in one embodiment;
  • Fig. 4 is a schematic diagram of the calibration process for two cameras scanning toward each other;
  • Fig. 5 is a schematic diagram of stitched scanning with two cameras in one embodiment;
  • Fig. 6 is a schematic diagram of the calibration process for stitched two-camera scanning;
  • Fig. 7 is a structural block diagram of a 3D camera calibration device in one embodiment.
  • The term "connection" in the following embodiments should be understood as "electrical connection", "communication connection", and the like, provided that electrical signals or data are transmitted between the connected objects.
  • a 3D camera calibration method is provided, which can be applied to a device used for calibration processing, such as a controller. Taking the application to the controller as an example, the method includes the following steps:
  • S110: Obtain scanned point cloud images of the first camera and the second camera to be calibrated, where the scanned point cloud images are generated by scanning the calibration holes on the calibration object.
  • the first camera and the second camera are 3D cameras that need to be calibrated.
  • The calibration object is an object used for calibrating the first camera and the second camera, such as a wooden block; holes are formed in it to serve as calibration holes, and there are multiple calibration holes.
  • The first camera scans one part of the calibration holes on the calibration object, and
  • the second camera scans another part; that is, the calibration holes scanned by the two cameras are different, and the first camera and the second camera have no common field of view.
  • the calibration hole may be a round hole, or a hole of other shapes.
  • the first camera and the second camera are used to scan the calibration hole on the calibration object to generate scanned point cloud images; the controller acquires the scanned point cloud images of the first camera and the scanned point cloud images of the second camera.
  • The hole center coordinates are the coordinates of the center of the calibration hole.
  • When the calibration hole is a round hole, the hole center coordinates are the coordinates of the circle center.
  • the first coordinate is the coordinate value of the calibration hole scanned by the first camera in the coordinate system of the first camera
  • the second coordinate is the coordinate value of the calibration hole scanned by the second camera in the coordinate system of the second camera.
  • the first coordinate and the second coordinate are three-dimensional coordinate values.
  • The first camera scans multiple calibration holes; for each calibration hole in its scanned point cloud image, the hole center coordinates are extracted to obtain the corresponding first coordinates. That is, multiple first coordinates are obtained from the first camera's scanned point cloud image.
  • Similarly, multiple second coordinates are obtained from the second camera's scanned point cloud image.
  • The calibration object has multiple surfaces, and the calibration holes scanned by the same camera are located on the same surface of the calibration object; for example, the calibration holes scanned by the first camera are located on one surface of the calibration object.
  • S150: Obtain the normal vector of the plane on which the calibration holes scanned by the first camera are located.
  • S170: Calculate the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the preset calibration hole distance data.
  • The calibration hole distance data represents the distance between the calibration holes scanned by the first camera and those scanned by the second camera.
  • The extension coordinates are the coordinate values of the positions of the calibration holes scanned by the second camera, expressed in the first camera's coordinate system. They are calculated from the first coordinates, the normal vector, and the calibration hole distance data; in other words, the calibration holes scanned by the second camera are mapped into the coordinate system of the first camera.
  • S190: Calculate the coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extension coordinates.
  • The second coordinates are the coordinate values of the calibration holes scanned by the second camera in the second camera's coordinate system, while
  • the extension coordinates are the coordinate values of the same holes in the first camera's coordinate system; that is,
  • the second coordinates and extension coordinates of the same calibration hole are coordinate values in two different coordinate systems.
  • The transformation relationship data represents the transformation between the coordinates of the first camera and the coordinates of the second camera, and may specifically be a matrix.
  • The second coordinates and extension coordinates are three-dimensional. The second coordinates of each calibration hole form one row of an original matrix, built from the second coordinates of all the calibration holes; the extension coordinates of each hole form the corresponding rows of an extension matrix. A mathematical algorithm then computes the data needed to transform one matrix into the other, yielding the transformation matrix.
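  • The "mathematical algorithm" for turning the matched original and extension matrices into a transformation is not named in the text. A common choice for estimating a rigid transform between two sets of corresponding 3D points is the SVD-based least-squares (Kabsch) method; the following NumPy sketch is illustrative, not the patent's specified implementation:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t with dst ≈ R @ src + t,
    via the SVD-based least-squares (Kabsch) method."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Here `src` would hold the extension coordinates (first camera frame) and `dst` the second coordinates (second camera frame), or vice versa, giving the transformation matrix between the two camera coordinate systems.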
  • The above 3D camera calibration method thus obtains the first coordinates from the point cloud image generated by the first camera scanning the calibration holes on the calibration object and the second coordinates from the point cloud image generated by the second camera, and calibrates the two cameras from these together with the normal vector and the calibration hole distance data, without requiring a common field of view.
  • In one embodiment, step S130 of extracting the hole center coordinates of the calibration holes in the first camera's scanned point cloud image to obtain the first coordinates includes: extracting the edge data points of the calibration holes in the first camera's scanned point cloud image; and extracting the hole center coordinates of the calibration holes from those edge data points to obtain the first coordinates.
  • Extracting the edge data points of a calibration hole amounts to determining the edge of the hole, and
  • fitting the hole center from the edge data points gives high accuracy.
  • In some embodiments, the gradient method is used to extract the edge data points, which determines the edge accurately, and a spatial center fitting algorithm extracts the hole center coordinates accurately from them.
  • For a round hole, the edge data points are extracted using the gradient method, and the circle center coordinates are extracted using the spatial center fitting algorithm.
  • In step S130, the hole center coordinates of the calibration holes in the second camera's scanned point cloud image are extracted to obtain the second coordinates in the same way as the first coordinates; the details are not repeated here.
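  • The spatial center fitting algorithm is not detailed in the text. One common realization for a round hole is to fit the supporting plane of the 3D edge points, project them into that plane, run an algebraic (Kasa) 2D circle fit, and map the fitted center back to 3D. A sketch under those assumptions (names are illustrative):

```python
import numpy as np

def fit_circle_center_3d(edge_pts):
    """Estimate the 3D center of a circular hole from edge points:
    fit the supporting plane by SVD, project into it, run an
    algebraic (Kasa) 2D circle fit, and map the center back to 3D."""
    P = np.asarray(edge_pts, dtype=float)
    centroid = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - centroid)   # Vt[2] = plane normal
    u, v = Vt[0], Vt[1]                      # orthonormal in-plane axes
    q = np.column_stack(((P - centroid) @ u, (P - centroid) @ v))
    # Kasa fit: x^2 + y^2 = 2*a*x + 2*b*y + c  (linear least squares)
    A = np.column_stack((2 * q[:, 0], 2 * q[:, 1], np.ones(len(q))))
    rhs = (q ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b = sol[0], sol[1]                    # 2D circle center
    return centroid + a * u + b * v          # back to 3D
```

With noisy scan data, a robust variant (e.g. discarding edge outliers before the fit) would typically be layered on top of this.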
  • In one embodiment, step S150 includes: selecting at least three non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located; determining the plane from the selected points; and calculating the normal vector of that plane.
  • Three non-collinear points determine a plane, whose normal vector can then be calculated.
  • The at least three non-collinear points are points outside the calibration holes that lie on the same plane as the holes.
  • For example, when the first camera scans the calibration holes on the left side of the calibration object, three non-collinear points beside those holes on the left side are selected to determine the plane and obtain the normal vector.
  • the normal vector of the plane where the calibration hole is located can be extracted through a plane fitting algorithm.
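  • The plane fitting algorithm is likewise not specified. A standard least-squares option takes the unit normal as the right singular vector of the centered points with the smallest singular value; this small sketch is one possible realization, not the patent's prescribed method:

```python
import numpy as np

def plane_normal(points):
    """Least-squares plane fit: the unit normal is the right singular
    vector of the centered points with the smallest singular value."""
    P = np.asarray(points, dtype=float)
    _, _, Vt = np.linalg.svd(P - P.mean(axis=0))
    return Vt[2]   # unit normal (sign is arbitrary)
```

Three exact points suffice, but feeding in many in-plane scan points averages out measurement noise.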
  • In one embodiment, at least three non-collinear calibration holes are provided on each of two opposite surfaces of the calibration object, with the positions of the holes on the two surfaces corresponding to each other;
  • the first camera's scanned point cloud image is generated by scanning the calibration holes on one surface, and the second camera's is generated by scanning the calibration holes on the other surface of the calibration object.
  • The calibration object is a standard part, preferably an ideal cuboid.
  • Its opposite surfaces include the left and right sides, the front and rear sides, and the upper and lower sides, and
  • the calibration hole positions on opposite surfaces correspond to each other.
  • For example, the first camera scans the left side and the second camera scans the right side, so this embodiment suits the case where two cameras scan toward each other from opposite sides.
  • For example, the two cameras are placed on the left and right sides of the calibration object, and several high-precision round holes (at least three, not collinear) are drilled on each of the left and right surfaces, ensuring that
  • the numbers of round holes on the two surfaces are equal and their positions correspond to each other.
  • The holes may be through holes.
  • The first camera (camera 1) scans each calibration hole on the left plane, and
  • the second camera (camera 2) scans each calibration hole on the right plane.
  • The scanning direction does not need to be parallel to the surface of the calibration object.
  • the distance data of the calibration hole includes the width of the calibration object, wherein the width of the calibration object is the width between two opposite surfaces of the calibration object on which the calibration hole is formed.
  • Step S170 includes: calculating the extension coordinates of the calibration hole scanned by the second camera in the first camera according to the first coordinates, the normal vector and the width of the calibration object.
  • That is, the coordinates of the calibration holes scanned by the second camera are computed in the coordinate system of the first camera;
  • these coordinates can be calculated with a geometric algorithm.
  • The coordinate system of the first camera, which scans the left side of the calibration object, is taken as the reference frame.
  • The coordinates of point A on the left surface of the calibration object are known,
  • the thickness d between the left and right surfaces of the calibration object is known, and
  • the normal vector is known.
  • The equation of the straight line through point A along the normal vector can therefore be determined; from this line equation, the coordinates of point A, and the calibration object width (the distance between point A and point B), the coordinates of the corresponding point B on the right surface are calculated to obtain the extension coordinates.
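  • The geometric step described above reduces to moving point A along the unit normal by the width d. A minimal sketch, assuming the normal is oriented from the first camera's surface toward the second camera's surface (otherwise the sign of d flips):

```python
import numpy as np

def extension_coordinates(A, normal, width):
    """Estimate point B on the far surface: move A along the unit
    plane normal by the calibration-object width.  Assumes `normal`
    points from the first camera's surface toward the second's."""
    A = np.asarray(A, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)   # normalize so `width` is a metric step
    return A + width * n
```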
  • Fig. 4 is a schematic diagram of the calibration process for two cameras scanning toward each other: after the calibration program starts, camera 1 and camera 2 scan the calibration object to generate point cloud images, which are then used to fit the N circles on the left and right planes respectively.
  • The coordinate processing in this application is completed in three-dimensional space and is not sensitive to factors such as camera position and scanning direction, so the application offers better robustness and higher precision in practice.
  • In another embodiment, two rows of calibration holes are provided on a calibration surface of the calibration object; the holes in each row lie on a straight line, the holes in the two rows correspond one to one in position, and
  • the line on which each row lies is perpendicular to the line connecting each pair of corresponding holes.
  • The two rows of calibration holes are opposite in position, correspond one to one, and are equal in number: the first hole in the first row corresponds in position to the first hole in the second row, the second hole in the first row corresponds in position to the second
  • hole in the second row, and so on.
  • Pairs of corresponding holes thus include the first hole in the first row with the first hole in the second row, the second hole in the first row with the second hole in the second row, and so forth.
  • Taking the upper surface as the calibration surface, for example, the two rows of calibration holes can be arranged along two opposite edges of that surface.
  • The scanned point cloud image of the first camera is generated by the first camera scanning one row of calibration holes on the calibration surface, and
  • the scanned point cloud image of the second camera is generated by the second camera scanning the other row of calibration holes on the calibration surface.
  • This embodiment is applicable to stitched scanning with two cameras, i.e. the two cameras scan the two sides of the same surface of an object, and the two partial images cannot be stitched directly because there is no common field of view. Note that this application targets 3D cameras, so the stitching is not only left-right: the heights of the two cameras may also differ.
  • One surface of the calibration object serves as the calibration surface, and its two sides serve as the areas to be detected,
  • with one row of calibration holes made on each side.
  • The centers of each row of calibration holes must lie on one straight line.
  • The two rows must be equal in number with positions corresponding one to one, and the line of each row must be perpendicular to the line connecting each pair of corresponding holes. For example, as shown in Fig. 5, two rows of round holes are provided on the upper surface of the calibration object.
  • The first camera scans the first row of calibration holes and
  • the second camera scans the second row,
  • yielding the transformation matrix between the two camera coordinate systems, which can then be used to stitch images or jointly process them in other ways.
  • This stitching approach places no common-field-of-view requirement on the cameras: different images need not share a field of view, and the 3D cameras used only need to cover their detection areas. There is no need to choose a large-field-of-view camera to include a common field of view, which preserves detection accuracy.
  • In one embodiment, the calibration hole distance data includes the hole spacing between two corresponding calibration holes in the two rows; for example, the hole spacing can equal the distance between the first hole in the first row and the first hole in the second row.
  • Step S170 then includes: fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the straight line on which those holes lie; and calculating, from the first coordinates, the normal vector, the straight-line equation, and the hole spacing, the extension coordinates of the calibration holes scanned by the second camera in the first camera.
  • The first coordinates are used to fit the straight-line equation of the first row;
  • the first coordinates, the normal vector, the straight-line equation, and the hole spacing between corresponding calibration holes then give
  • the coordinates of the holes scanned by the second camera in the first camera's coordinate system, computed with a geometric algorithm. For example, in Fig. 5, point C lies in the first row and point D in the second row.
  • Because the line CD is perpendicular to the first row's line and lies in the calibration plane, the line through C and D is uniquely determined; from its equation, the coordinates of C, and the hole spacing, the coordinates of point D are calculated to obtain the extension coordinates. The extension coordinates of the other calibration holes in the second row are calculated in the same way.
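  • The line fit and the step from C to D can be sketched as follows: the row direction is the principal axis of the fitted hole centers, and D lies at the hole spacing from C along the in-plane direction perpendicular to that row line. Function names and the sign convention (which depends on the orientations of the normal and the row direction) are assumptions, not taken from the patent:

```python
import numpy as np

def fit_row_direction(centers):
    """Unit direction of the straight line through a row of hole
    centers (principal axis of the centered points)."""
    P = np.asarray(centers, dtype=float)
    _, _, Vt = np.linalg.svd(P - P.mean(axis=0))
    return Vt[0]

def second_row_point(C, row_dir, normal, spacing):
    """Estimate the hole D in the second row corresponding to C:
    step by `spacing` along the in-plane direction perpendicular to
    the row line.  The sign of the step depends on the orientations
    of `normal` and `row_dir`."""
    u = np.cross(normal, row_dir)   # in-plane, perpendicular to the row
    u = u / np.linalg.norm(u)
    return np.asarray(C, dtype=float) + spacing * u
```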
  • Fig. 6 is a schematic diagram of the calibration process for stitched scanning in one embodiment.
  • Although the steps in the flowcharts of Fig. 1, Fig. 4, and Fig. 6 are shown sequentially as indicated by the arrows, they are not necessarily executed in that order. Unless otherwise specified herein, there is no strict ordering restriction, and the steps may be executed in other orders. Moreover, at least some of the steps in Fig. 1, Fig. 4, and Fig. 6 may include multiple sub-steps or stages, which need not be performed at the same time and may be performed at different times; their execution order is likewise not necessarily sequential, and they may be executed in turn or alternately with other steps or with sub-steps or stages of other steps.
  • In one embodiment, a 3D camera calibration device is provided, including: an image acquisition module 710, a coordinate extraction module 730, a normal vector acquisition module 750, a coordinate mapping module 770, and a relationship acquisition module 790, wherein:
  • the image acquisition module 710 is used to obtain the scanned point cloud images of the first camera to be calibrated and the second camera, and the scanned point cloud image is a point cloud image generated by scanning the calibration hole on the calibration object;
  • the coordinate extraction module 730 is used to extract the hole center coordinates of the calibration holes in the first camera's scanned point cloud image to obtain the first coordinates, and to extract the hole center coordinates of the calibration holes in the second camera's scanned point cloud image to obtain the second coordinates;
  • the normal vector acquisition module 750 is used to obtain the normal vector of the plane on which the calibration holes scanned by the first camera are located;
  • the coordinate mapping module 770 is used to calculate, from the first coordinates, the normal vector, and the preset calibration hole distance data, the extension coordinates of the calibration holes scanned by the second camera in the first camera; and
  • the relationship acquisition module 790 is used to calculate the coordinate transformation relationship data between the first camera and the second camera from the second coordinates and the extension coordinates.
  • The above 3D camera calibration device likewise obtains the first coordinates from the point cloud image generated by the first camera scanning the calibration holes on the calibration object and the second coordinates from the point cloud image generated by the second camera, and calibrates the two cameras without a common field of view.
  • In one embodiment, the coordinate extraction module 730 is used to extract the edge data points of the calibration holes in the first camera's scanned point cloud image and extract the hole center coordinates from them to obtain the first coordinates, and to extract the edge data points of the calibration holes in the second camera's scanned point cloud image and extract the hole center coordinates from them to obtain the second coordinates.
  • In one embodiment, the normal vector acquisition module 750 selects at least three non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located, determines the plane from the selected points, and calculates the normal vector of that plane.
  • In one embodiment, at least three non-collinear calibration holes are provided on each of two opposite surfaces of the calibration object, with the positions of the holes on the two surfaces corresponding to each other;
  • the first camera's scanned point cloud image is generated by scanning the calibration holes on one surface, and the second camera's is generated by scanning the calibration holes on the other surface of the calibration object.
  • the calibration hole distance data includes the calibration object width, wherein the calibration object width is the width between two opposite surfaces of the calibration object on which the calibration hole is set.
  • The coordinate mapping module 770 calculates the extension coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the calibration object width.
  • In one embodiment, two rows of calibration holes are provided on the calibration surface of the calibration object; the holes in each row lie on a straight line, the holes in the two rows correspond one to one in position, and
  • the line on which each row lies is perpendicular to the line connecting each pair of corresponding holes;
  • the scanned point cloud image of the first camera is generated by the first camera scanning one row of calibration holes on the calibration surface, and
  • the scanned point cloud image of the second camera is generated by the second camera scanning the other row of calibration holes on the calibration surface.
  • the calibration hole distance data includes the hole spacing between the two calibration holes corresponding to the positions in the two rows of calibration holes; the coordinate mapping module 770 fits the first coordinates of the multiple calibration holes scanned by the first camera.
  • the linear equation of the straight line where the calibration hole scanned by the camera is located; according to the first coordinate, the normal vector, the linear equation and the hole spacing, the extended coordinates of the calibration hole scanned by the second camera in the first camera are calculated.
  • Each module in the above 3D camera calibration device may be implemented wholly or partly by software, hardware, or a combination thereof.
  • The above modules may be embedded in, or independent of, the processor in the controller in hardware form, or stored in the memory of the controller in software form, so that the processor can invoke and perform the operations corresponding to each module.
  • It should be noted that the division into modules in the embodiments of the present application is schematic and represents only a logical function division; other divisions are possible in actual implementation.
  • In one embodiment, a calibration system is provided, including a first camera, a second camera, and a controller.
  • The first camera is used to scan the calibration holes on the calibration object and generate a point cloud image, yielding the scanned point cloud image of the first camera.
  • The second camera is used to scan the calibration holes on the calibration object and generate a point cloud image, yielding the scanned point cloud image of the second camera.
  • The controller is connected to the first camera and the second camera and includes a memory and a processor; the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
  • The calibration holes scanned by the first camera and the second camera are different.
  • Since the above calibration system includes a controller capable of implementing the steps of the above method embodiments, it likewise does not need to rely on a common field of view and can calibrate the two cameras simply and efficiently.
  • In one embodiment, the above calibration system further includes a calibration object; at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding.
  • In another embodiment, the above calibration system further includes a calibration object; two rows of calibration holes are arranged on the calibration surface of the calibration object, the calibration holes in each row lie on a straight line, the hole positions in the two rows correspond one to one, and the line on which each row lies is perpendicular to the line connecting two correspondingly positioned holes in the two rows.
  • Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory or optical memory, etc.
  • Volatile memory can include Random Access Memory (RAM) or external cache memory.
  • RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).


Abstract

The present application relates to a 3D camera calibration method, device, and calibration system. The method includes: acquiring scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object; extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates; computing the normal vector of the plane on which the calibration holes scanned by the first camera lie; extrapolating, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera; and calculating coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates. With the present application, no common field of view is required, and calibration of the two cameras can be completed simply and efficiently.

Description

3D Camera Calibration Method, Device, and Calibration System — Technical Field
The present application relates to the technical field of machine vision, and in particular to a 3D camera calibration method, device, and calibration system.
Background Art
With the development of technology, 3D cameras capable of capturing stereoscopic images have emerged. In some scenarios, two 3D cameras are used to capture images that are then jointly processed; the cameras must be calibrated to obtain the coordinate transformation relationship between them.
Conventional calibration methods usually require a common field of view and perform camera calibration on that basis. Calibrating two cameras that share no common field of view usually requires additional electronic tools, such as a laser rangefinder, which makes the operation complicated.
Summary of the Invention
Accordingly, in view of the above technical problem, it is necessary to provide a simple and efficient 3D camera calibration method, device, and calibration system that do not require a common field of view.
A 3D camera calibration method, comprising:
acquiring scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;
extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
computing the normal vector of the plane on which the calibration holes scanned by the first camera lie;
extrapolating, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera;
calculating coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
In one embodiment, extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates comprises:
extracting the edge data points of the calibration holes in the scanned point cloud image of the first camera;
extracting the hole-center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
In one embodiment, computing the normal vector of the plane on which the calibration holes scanned by the first camera lie comprises:
selecting at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located;
determining a plane from the selected points;
computing the normal vector of the plane.
In one embodiment, at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface of the calibration object.
In one embodiment, the calibration-hole distance data includes the calibration object width, which is the width between the two opposite surfaces of the calibration object in which the calibration holes are provided; and extrapolating, according to the first coordinates, the normal vector, and the preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera comprises:
extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the calibration object width.
In one embodiment, two rows of calibration holes are provided in the calibration surface of the calibration object; the calibration holes in each row lie on a straight line, the hole positions in the two rows correspond one to one, and the line on which each row lies is perpendicular to the line connecting two correspondingly positioned holes in the two rows;
the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface.
In one embodiment, the calibration-hole distance data includes the hole spacing between two correspondingly positioned calibration holes in the two rows; and extrapolating, according to the first coordinates, the normal vector, and the preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera comprises:
fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the straight line on which those calibration holes lie;
extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, the line equation, and the hole spacing.
A 3D camera calibration device, comprising:
an image acquisition module, used to acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;
a coordinate extraction module, used to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
a normal vector acquisition module, used to compute the normal vector of the plane on which the calibration holes scanned by the first camera lie;
a coordinate mapping module, used to extrapolate, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera;
a relationship acquisition module, used to calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
A calibration system, comprising:
a first camera, used to scan calibration holes on a calibration object and generate a point cloud image, yielding the scanned point cloud image of the first camera;
a second camera, used to scan calibration holes on the calibration object and generate a point cloud image, yielding the scanned point cloud image of the second camera;
a controller connected to the first camera and the second camera, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the following steps:
acquiring scanned point cloud images of the first camera and the second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning the calibration holes on the calibration object;
extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
computing the normal vector of the plane on which the calibration holes scanned by the first camera lie;
extrapolating, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera;
calculating coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
In one embodiment, the above calibration system further includes the calibration object;
at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; or two rows of calibration holes are provided on the calibration surface of the calibration object, the calibration holes in each row lying on a straight line, the hole positions in the two rows corresponding one to one, and the line on which each row lies being perpendicular to the line connecting two correspondingly positioned holes in the two rows.
In the above 3D camera calibration method, device, and calibration system, the first coordinates of the calibration holes scanned by the first camera, in the first camera's coordinate system, are extracted from the point cloud image the first camera generates by scanning the calibration holes on the calibration object, and the second coordinates of the calibration holes scanned by the second camera, in the second camera's coordinate system, are extracted from the second camera's scanned point cloud image; the extended coordinates of the calibration holes scanned by the second camera in the first camera are then extrapolated from the first coordinates, the normal vector of the plane on which the first camera's calibration holes lie, and the calibration-hole distance data; finally, the correspondence between the second coordinates and the extended coordinates is computed to obtain the transformation relationship data between the two camera coordinate systems. In this way, no common field of view is relied upon: the scheme applies to calibrating two cameras that share no common field of view, and with only a single calibration object the two cameras can be calibrated simply and efficiently.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present application; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a 3D camera calibration method in one embodiment;
FIG. 2 is a schematic diagram of two cameras scanning opposite sides in one embodiment;
FIG. 3 is a schematic diagram of two points on opposite sides of the calibration object in one embodiment;
FIG. 4 is a schematic flowchart of calibration with two cameras scanning opposite sides;
FIG. 5 is a schematic diagram of two cameras scanning for stitching in one embodiment;
FIG. 6 is a schematic flowchart of calibration with two cameras scanning for stitching;
FIG. 7 is a structural block diagram of a 3D camera calibration device in one embodiment.
Detailed Description
To facilitate understanding of the present application, the application is described more fully below with reference to the relevant drawings, in which embodiments of the application are shown. The application may, however, be embodied in many different forms and is not limited to the embodiments described herein; rather, these embodiments are provided so that the disclosure of the application will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terms used herein in the specification of the application are for the purpose of describing specific embodiments only and are not intended to limit the application.
It will be understood that the terms "first", "second", and the like used in the present application may be used herein to describe various elements, but these elements are not limited by these terms; the terms are used only to distinguish one element from another.
It should be noted that when an element is referred to as being "connected" to another element, it can be directly connected to the other element or connected through an intervening element. In addition, "connected" in the following embodiments should be understood as "electrically connected", "communicatively connected", or the like, if electrical signals or data are transferred between the connected objects.
As used herein, the singular forms "a", "an", and "the" may also include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the terms "comprise/include" or "have" and the like specify the presence of stated features, integers, steps, operations, components, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.
In one embodiment, as shown in FIG. 1, a 3D camera calibration method is provided, which can be applied to a device used for calibration processing, such as a controller. Taking application to a controller as an example, the method includes the following steps:
S110: Acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object.
The first camera and the second camera are the 3D cameras to be calibrated. The calibration object is the object used when calibrating the first and second cameras, such as a wooden block; holes formed in the calibration object serve as calibration holes, and there are multiple calibration holes. Specifically, the first camera scans one part of the calibration holes on the calibration object and the second camera scans another part; that is, the first and second cameras scan different calibration holes and have no common field of view. The calibration holes may be circular holes or holes of other shapes. Before calibration processing, the first and second cameras scan the calibration holes on the calibration object and each generates a scanned point cloud image; the controller then acquires the scanned point cloud images of the first and second cameras.
S130: Extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates.
The hole-center coordinates of a calibration hole are the coordinates of its center; for a circular calibration hole, for example, the hole-center coordinates are the circle-center coordinates. The first coordinates are the coordinate values, in the first camera's coordinate system, of the calibration holes scanned by the first camera; the second coordinates are the coordinate values, in the second camera's coordinate system, of the calibration holes scanned by the second camera. Specifically, the first and second coordinates are three-dimensional coordinate values.
Specifically, the first camera scans multiple calibration holes, and for each calibration hole in its scanned point cloud image the hole-center coordinates are obtained as corresponding first coordinates; that is, multiple first coordinates are obtained from the first camera's scanned point cloud image. Likewise, multiple second coordinates are obtained from the second camera's scanned point cloud image.
S150: Compute the normal vector of the plane on which the calibration holes scanned by the first camera lie.
The calibration object has multiple surfaces, and the calibration holes scanned by one camera are located on the same surface of the calibration object; for example, the calibration holes scanned by the first camera all lie on one and the same surface of the calibration object.
S170: Extrapolate, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera.
The calibration-hole distance data characterizes the distance between the calibration holes scanned by the first camera and those scanned by the second camera. The extended coordinates are the coordinate values, in the first camera's coordinate system, of the positions of the calibration holes scanned by the second camera. From the first coordinates corresponding to the first camera, the normal vector, and the calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera are extrapolated; that is, the calibration holes scanned by the second camera are mapped to coordinate values in the first camera's coordinate system.
S190: Calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
The second coordinates are the coordinate values of the calibration holes scanned by the second camera in the second camera's coordinate system, and the extended coordinates are the coordinate values of the same holes in the first camera's coordinate system; thus, for any given calibration hole, the second coordinate and the extended coordinate are values of the same point in different coordinate systems. By computing the correspondence between the second coordinates and the extended coordinates, the transformation relationship data between the two coordinate systems — and hence between the first and second cameras — is obtained.
The transformation relationship data characterizes the transformation between the coordinates of the first camera and those of the second camera, and may specifically be a matrix. For example, suppose the second camera scans 3 or more calibration holes; the second coordinates and extended coordinates are three-dimensional values. The second coordinate of each hole forms one row of a matrix, so the second coordinates of the multiple holes yield an original matrix; the extended coordinate of each hole forms one row of another matrix, yielding an extended matrix. A mathematical algorithm then computes the data needed to transform the original matrix into the extended matrix (or the extended matrix into the original matrix), giving the transformation matrix.
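The text leaves the "mathematical algorithm" for recovering the transformation between the two sets of matched 3D points unspecified. A common choice for this kind of rigid alignment of point correspondences is the SVD-based least-squares fit (the Kabsch method); the sketch below is illustrative only and is not taken from the patent:

```python
import numpy as np

def rigid_transform(src, dst):
    """Find R, t such that R @ src[i] + t ~= dst[i] in the least-squares
    sense, via the SVD-based (Kabsch) method on matched 3D point pairs."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Applied here, `src` would hold the second coordinates (second camera's frame) and `dst` the extended coordinates (first camera's frame), so that `R` and `t` map second-camera points into the first camera's coordinate system.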
In the above 3D camera calibration method, the first coordinates of the calibration holes scanned by the first camera are extracted from the point cloud image the first camera generates by scanning the calibration holes on the calibration object, and the second coordinates of the calibration holes scanned by the second camera are extracted from the second camera's scanned point cloud image; the extended coordinates of the second camera's calibration holes in the first camera are then extrapolated from the first coordinates, the normal vector of the plane on which the first camera's calibration holes lie, and the calibration-hole distance data; finally, the correspondence between the second coordinates and the extended coordinates is computed to obtain the transformation relationship data between the two camera coordinate systems. In this way, no common field of view is relied upon: the method applies to calibrating two cameras without a common field of view, and with only a single calibration object the two cameras can be calibrated simply and efficiently.
In one embodiment, extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates in step S130 includes: extracting the edge data points of the calibration holes in the scanned point cloud image of the first camera, and extracting the hole-center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
Extracting the edge data points of a calibration hole determines the hole's edge. Determining the edge first and then determining the coordinates of the hole's center from the edge gives high accuracy. Specifically, the edge data points of the calibration hole can be accurately extracted with a gradient method, and the hole-center coordinates can be accurately obtained with a spatial circle-center fitting algorithm. For example, for a circular calibration hole, the gradient method extracts the edge data points of the circular hole, and the spatial circle-center fitting algorithm extracts the circle-center coordinates.
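The patent names a gradient method and a spatial circle-center fitting algorithm without detailing either. As an illustrative sketch of the fitting step only, one common way to fit the center of a circular hole from 3D edge points is to fit the hole's plane by PCA, fit a circle in that plane by linear least squares, and map the center back to 3D; the function below is an assumption of this write-up, not the patent's implementation:

```python
import numpy as np

def fit_circle_center_3d(pts):
    """Fit the center of a circle to 3D edge points lying near a common plane.
    Sketch: fit the plane by PCA, project the points into it, solve the
    algebraic least-squares circle fit in 2D, and map the center back to 3D."""
    pts = np.asarray(pts, float)
    centroid = pts.mean(axis=0)
    # the two largest principal directions span the best-fit plane
    _, _, Vt = np.linalg.svd(pts - centroid)
    e1, e2 = Vt[0], Vt[1]
    # 2D coordinates of the edge points within the plane
    uv = np.column_stack([(pts - centroid) @ e1, (pts - centroid) @ e2])
    # algebraic circle fit: u^2 + v^2 = 2*a*u + 2*b*v + c
    A = np.column_stack([2 * uv, np.ones(len(uv))])
    rhs = (uv ** 2).sum(axis=1)
    (a, b, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return centroid + a * e1 + b * e2
```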
Specifically, the step in S130 of extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain the second coordinates is the same as the step of extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates, and is not repeated here.
In one embodiment, step S150 includes: selecting at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located; determining a plane from the selected points; and computing the normal vector of the plane.
On the surface where the calibration holes scanned by the first camera are located, at least 3 non-collinear points are selected; 3 non-collinear points determine a plane, whose normal vector is then computed. Specifically, the selected points are points other than the calibration holes but coplanar with them. Compared with determining the plane directly from the calibration holes — whose coordinates are obtained by computing hole-center positions and may therefore carry error — selecting separate points to determine the plane is more accurate. For example, if the first camera scans the calibration holes on the left surface of the calibration object, 3 non-collinear points other than the calibration holes are selected on that left surface to determine the plane and compute the normal vector. Specifically, the normal vector of the plane on which the calibration holes lie can be extracted with a plane-fitting algorithm.
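The minimal case of the step above — three non-collinear points determining a plane and its unit normal via a cross product — can be sketched as follows (for more than three points, the plane-fitting algorithm mentioned in the text would be used instead):

```python
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the plane through three non-collinear 3D points."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    # two in-plane edge vectors; their cross product is normal to the plane
    n = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(n)
    if norm < 1e-12:
        raise ValueError("points are collinear")
    return n / norm
```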
In one embodiment, at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface.
The calibration object is a standard part and is preferably an ideal rectangular cuboid. Taking a cuboid calibration object as an example, its opposite sides include left and right, front and back, and top and bottom; for instance, at least 3 non-collinear calibration holes can be formed in each of the left and right sides, with equal numbers of holes on the two sides and one-to-one corresponding positions, the first camera scanning the left side and the second camera scanning the right side. This embodiment thus suits the case where the two cameras scan opposite sides.
As shown in FIG. 2, the two cameras are arranged on the left and right of the calibration object, and a number of high-precision circular holes (at least 3, not all on one line) are drilled in the left and right surfaces of the calibration object, with equal numbers of holes in the two planes and one-to-one corresponding positions. Specifically, the holes can be drilled through to guarantee this one-to-one position correspondence. The first camera (camera 1) scans the calibration holes on the left plane and the second camera (camera 2) scans those on the right plane; in theory, the scanning direction need not be parallel to the surface of the calibration object.
In one embodiment, the calibration-hole distance data includes the calibration object width, which is the width between the two opposite surfaces of the calibration object in which the calibration holes are provided. Step S170 includes: extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the calibration object width.
For two cameras scanning opposite sides, the coordinates of the second camera's calibration holes in the first camera's coordinate system are obtained from the first coordinates in the first camera's coordinate system, the normal vector of the plane scanned by the first camera, and the intrinsic width between the two hole-bearing surfaces of the calibration object. Specifically, the coordinates can be computed geometrically. As shown in FIG. 3, taking the field of view of the first camera (which scans the left side of the calibration object) as the reference frame, the coordinates of point A on the left surface, the thickness d between the left and right sides, and the normal vector are all known; the equation of the line through A can be determined from the normal vector and A's coordinates, and the coordinates of point B on the right side — the extended coordinate — are then obtained from the line equation, A's coordinates, and the calibration object width (the distance between A and B).
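The geometric step above — moving from point A along the surface normal by the object width d to reach point B — can be sketched as below. The sign convention (the normal pointing from the first surface toward the second) is an assumption of this sketch, not something the text fixes:

```python
import numpy as np

def extend_through_object(first_coord, normal, width):
    """Extended coordinate of the opposite-surface hole in the first
    camera's frame: offset point A along the unit surface normal by the
    calibration object width d. Assumes `normal` points from the first
    surface toward the second."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)  # normalize so `width` is a metric distance
    return np.asarray(first_coord, float) + float(width) * n
```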
FIG. 4 shows the calibration flow for two cameras scanning opposite sides. After the calibration program starts, camera 1 and camera 2 scan the calibration object to generate point cloud images. From these, the three-dimensional positions (circle-center coordinates) of the N circle centers on the left plane in camera 1, and of the corresponding N circle centers on the right plane in camera 2, are fitted. Then, using the normal vector of the left plane in camera 1 and the intrinsic width of the calibration object, the three-dimensional coordinates of the corresponding N right-plane circle centers in camera 1 are extrapolated. Finally, with the three-dimensional coordinates of the N right-plane circle centers in camera 2 available, the coordinate-system transformation between camera 1 and camera 2 is computed to obtain the transformation matrix, completing the calibration of the two cameras. All coordinate processing in the present application is carried out in three-dimensional space and is insensitive to factors such as camera position and scanning direction, so the application offers better robustness and higher accuracy in practical use.
In one embodiment, two rows of calibration holes are provided in the calibration surface of the calibration object; the holes in each row lie on a straight line, the hole positions in the two rows correspond one to one, and the line on which each row lies is perpendicular to the line connecting two correspondingly positioned holes in the two rows. Specifically, the two rows of holes face each other, correspond one to one, and are equal in number: the first hole of the first row corresponds in position to the first hole of the second row, the second hole of the first row to the second hole of the second row, and so on. Taking two holes per row as an example, the correspondingly positioned pairs are the first hole of each row and the second hole of each row. Taking a square calibration surface as an example, the two rows of holes can be arranged along two opposite edges of the calibration surface.
Specifically, the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row.
This embodiment suits the case of two cameras scanning for stitching, i.e., the two cameras scan two side regions of the same surface of an object, and these regions cannot be stitched directly because there is no common field of view. It should be noted that the present application concerns 3D cameras, so the stitching is not only left-right stitching; the heights of the regions may also differ.
In this embodiment, one surface of the calibration object serves as the calibration surface and its two sides serve as the regions to be inspected, each carrying one row of calibration holes. The hole centers of each row must lie on the same straight line; the two rows must have equal numbers of holes with one-to-one corresponding positions; and the line of each row must be perpendicular to each line connecting correspondingly positioned holes of the two rows. As shown in FIG. 5, two rows of circular holes are provided on the top surface of the calibration object; the first camera scans the first row and the second camera scans the second row, and the transformation matrix between the two camera coordinate systems is finally computed, enabling image stitching or other joint image-processing results. Stitching in this embodiment places no requirement on the cameras' fields of view — the images need not contain a common field of view, and the 3D cameras' fields of view need only cover the inspection regions — so no large-field-of-view camera has to be chosen merely to include a common field of view, and inspection accuracy is preserved.
In one embodiment, the calibration-hole distance data includes the hole spacing between two correspondingly positioned calibration holes in the two rows; for example, the hole spacing can equal the spacing between the first hole of the first row and the first hole of the second row. Step S170 includes: fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the straight line on which those holes lie; and extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, the line equation, and the hole spacing.
For two cameras scanning for stitching, the equation of the line through the first coordinates is fitted, and the coordinates of the calibration holes scanned by the second camera in the first camera's coordinate system are obtained from the first coordinates, the normal vector, the line equation, and the hole spacing between corresponding holes. Specifically, the coordinates can be computed geometrically. For example, in FIG. 5, point C lies in the first row and point D in the second row; the line through D is uniquely determined by the line on which C's row of holes lies and the normal vector of the plane, so the equation of the line through C and D can be obtained, and the coordinates of D — an extended coordinate — are then computed from that line equation and C's coordinates. The extended coordinates of the other holes in the second row are computed in the same way. FIG. 6 is a schematic flowchart of calibration for stitched scanning in one embodiment.
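The stitched-scan extrapolation described above can be sketched as follows: fit the line through the first-row hole centers, take the in-plane direction perpendicular to it as the cross product of the plane normal and the line direction, and offset each center by the known hole spacing. Which side of the line the second row lies on is a sign convention this sketch leaves as an assumption:

```python
import numpy as np

def extend_across_rows(first_coords, normal, spacing):
    """Extended coordinates of the second row of holes in the first
    camera's frame. Sketch: fit the line through the first-row centers,
    form the in-plane perpendicular direction (cross of plane normal and
    line direction), and offset each center by the hole spacing."""
    P = np.asarray(first_coords, float)
    centroid = P.mean(axis=0)
    # line direction: principal component of the centered hole centers
    _, _, Vt = np.linalg.svd(P - centroid)
    line_dir = Vt[0]
    n = np.asarray(normal, float)
    perp = np.cross(n / np.linalg.norm(n), line_dir)
    perp /= np.linalg.norm(perp)  # unit in-plane perpendicular
    return P + float(spacing) * perp
```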
It should be understood that although the steps in the flowcharts of FIG. 1, FIG. 4, and FIG. 6 are shown in sequence as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in FIG. 1, FIG. 4, and FIG. 6 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments; their execution order is not necessarily sequential either, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 7, a 3D camera calibration device is provided, including an image acquisition module 710, a coordinate extraction module 730, a normal vector acquisition module 750, a coordinate mapping module 770, and a relationship acquisition module 790, wherein:
the image acquisition module 710 is used to acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object; the coordinate extraction module 730 is used to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates; the normal vector acquisition module 750 is used to compute the normal vector of the plane on which the calibration holes scanned by the first camera lie; the coordinate mapping module 770 is used to extrapolate, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera; and the relationship acquisition module 790 is used to calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
In the above 3D camera calibration device, the first coordinates of the calibration holes scanned by the first camera are extracted from the point cloud image the first camera generates by scanning the calibration holes on the calibration object, and the second coordinates of the calibration holes scanned by the second camera are extracted from the second camera's scanned point cloud image; the extended coordinates of the second camera's calibration holes in the first camera are then extrapolated from the first coordinates, the normal vector of the plane on which the first camera's calibration holes lie, and the calibration-hole distance data; finally, the correspondence between the second coordinates and the extended coordinates is computed to obtain the transformation relationship data between the two camera coordinate systems. In this way, no common field of view is relied upon: the device applies to calibrating two cameras without a common field of view, and with only a single calibration object the two cameras can be calibrated simply and efficiently.
In one embodiment, the coordinate extraction module 730 is used to extract the edge data points of the calibration holes in the scanned point cloud image of the first camera; extract the hole-center coordinates of the calibration holes from those edge data points to obtain the first coordinates; extract the edge data points of the calibration holes in the scanned point cloud image of the second camera; and extract the hole-center coordinates of the calibration holes from those edge data points to obtain the second coordinates.
In one embodiment, the normal vector acquisition module 750 selects at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located, determines a plane from the selected points, and computes the normal vector of the plane.
In one embodiment, at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface.
Correspondingly, the calibration-hole distance data includes the calibration object width, which is the width between the two opposite hole-bearing surfaces of the calibration object. The coordinate mapping module 770 extrapolates the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the calibration object width.
In one embodiment, two rows of calibration holes are provided in the calibration surface of the calibration object; the holes in each row lie on a straight line, the hole positions in the two rows correspond one to one, and the line on which each row lies is perpendicular to the line connecting two correspondingly positioned holes in the two rows; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row.
Correspondingly, the calibration-hole distance data includes the hole spacing between two correspondingly positioned calibration holes in the two rows. The coordinate mapping module 770 fits, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the line on which those holes lie, and extrapolates the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, the line equation, and the hole spacing.
For specific limitations of the 3D camera calibration device, reference may be made to the limitations of the 3D camera calibration method above, which are not repeated here. Each module in the above 3D camera calibration device may be implemented wholly or partly by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, the processor in the controller in hardware form, or stored in the memory of the controller in software form, so that the processor can invoke and perform the operations corresponding to each module. It should be noted that the division into modules in the embodiments of the present application is schematic and represents only a logical function division; other divisions are possible in actual implementation.
In one embodiment, a calibration system is provided, including a first camera, a second camera, and a controller. The first camera is used to scan calibration holes on a calibration object and generate a point cloud image, yielding the first camera's scanned point cloud image; the second camera is used to scan calibration holes on the calibration object and generate a point cloud image, yielding the second camera's scanned point cloud image; the controller, connected to the first camera and the second camera, includes a memory and a processor, the memory storing a computer program whose execution by the processor implements the steps of the above method embodiments. Specifically, the calibration holes scanned by the first camera and the second camera are different.
Since the above calibration system includes a controller capable of implementing the steps of the above method embodiments, it likewise does not need to rely on a common field of view and can calibrate the two cameras simply and efficiently.
In one embodiment, the above calibration system further includes the calibration object; at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding.
In another embodiment, the above calibration system further includes the calibration object; two rows of calibration holes are arranged on the calibration surface of the calibration object, the holes in each row lying on a straight line, the hole positions in the two rows corresponding one to one, and the line on which each row lies being perpendicular to the line connecting two correspondingly positioned holes in the two rows.
For specific limitations of the computer program executed by the processor of the controller in the calibration system, reference may be made to the limitations of the 3D camera calibration method above, which are not repeated here.
A person of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium, and when executed, it can include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided in the present application may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, or optical memory. Volatile memory may include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM may take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
In the description of this specification, reference to the terms "some embodiments", "other embodiments", "ideal embodiment", and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic descriptions of the above terms do not necessarily refer to the same embodiment or example.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as the combinations of these technical features are not contradictory, they should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention patent. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (10)

  1. A 3D camera calibration method, characterized by comprising:
    acquiring scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;
    extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
    computing the normal vector of the plane on which the calibration holes scanned by the first camera lie;
    extrapolating, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera;
    calculating coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
  2. The method according to claim 1, characterized in that extracting the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain the first coordinates comprises:
    extracting the edge data points of the calibration holes in the scanned point cloud image of the first camera;
    extracting the hole-center coordinates of the calibration holes from the edge data points to obtain the first coordinates.
  3. The method according to claim 1, characterized in that computing the normal vector of the plane on which the calibration holes scanned by the first camera lie comprises:
    selecting at least 3 non-collinear points on the surface of the calibration object where the calibration holes scanned by the first camera are located;
    determining a plane from the selected points;
    computing the normal vector of the plane.
  4. The method according to any one of claims 1-3, characterized in that at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning the calibration holes on one surface of the calibration object, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the calibration holes on the other surface of the calibration object.
  5. The method according to claim 4, characterized in that the calibration-hole distance data includes the calibration object width, the calibration object width being the width between the two opposite surfaces of the calibration object in which the calibration holes are provided; and extrapolating, according to the first coordinates, the normal vector, and the preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera comprises:
    extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, and the calibration object width.
  6. The method according to any one of claims 1-3, characterized in that two rows of calibration holes are provided in the calibration surface of the calibration object, the calibration holes in each row lying on a straight line, the hole positions in the two rows corresponding one to one, and the line on which each row lies being perpendicular to the line connecting two correspondingly positioned calibration holes in the two rows;
    the scanned point cloud image of the first camera is the point cloud image generated by the first camera scanning one row of calibration holes on the calibration surface, and the scanned point cloud image of the second camera is the point cloud image generated by the second camera scanning the other row of calibration holes on the calibration surface.
  7. The method according to claim 6, characterized in that the calibration-hole distance data includes the hole spacing between two correspondingly positioned calibration holes in the two rows; and extrapolating, according to the first coordinates, the normal vector, and the preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera comprises:
    fitting, from the first coordinates of the multiple calibration holes scanned by the first camera, the equation of the straight line on which those calibration holes lie;
    extrapolating the extended coordinates of the calibration holes scanned by the second camera in the first camera according to the first coordinates, the normal vector, the line equation, and the hole spacing.
  8. A 3D camera calibration device, characterized by comprising:
    an image acquisition module, used to acquire scanned point cloud images of a first camera and a second camera to be calibrated, the scanned point cloud images being point cloud images generated by scanning calibration holes on a calibration object;
    a coordinate extraction module, used to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the first camera to obtain first coordinates, and to extract the hole-center coordinates of the calibration holes in the scanned point cloud image of the second camera to obtain second coordinates;
    a normal vector acquisition module, used to compute the normal vector of the plane on which the calibration holes scanned by the first camera lie;
    a coordinate mapping module, used to extrapolate, according to the first coordinates, the normal vector, and preset calibration-hole distance data, the extended coordinates of the calibration holes scanned by the second camera in the first camera;
    a relationship acquisition module, used to calculate coordinate transformation relationship data between the first camera and the second camera according to the second coordinates and the extended coordinates.
  9. A calibration system, characterized by comprising:
    a first camera, used to scan calibration holes on a calibration object and generate a point cloud image, yielding the scanned point cloud image of the first camera;
    a second camera, used to scan calibration holes on the calibration object and generate a point cloud image, yielding the scanned point cloud image of the second camera;
    a controller connected to the first camera and the second camera, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1-7.
  10. The calibration system according to claim 9, characterized by further comprising the calibration object;
    wherein at least 3 non-collinear calibration holes are provided in each of two opposite surfaces of the calibration object, with the hole positions on the two surfaces corresponding; or two rows of calibration holes are provided on the calibration surface of the calibration object, the calibration holes in each row lying on a straight line, the hole positions in the two rows corresponding one to one, and the line on which each row lies being perpendicular to the line connecting two correspondingly positioned calibration holes in the two rows.
PCT/CN2022/087716 2021-07-15 2022-04-19 3d相机标定方法、装置及标定系统 WO2023284349A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110802360.3 2021-07-15
CN202110802360.3A CN113436277A (zh) 2021-07-15 2021-07-15 3d相机标定方法、装置及标定系统

Publications (1)

Publication Number Publication Date
WO2023284349A1 true WO2023284349A1 (zh) 2023-01-19

Family

ID=77760670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087716 WO2023284349A1 (zh) 2021-07-15 2022-04-19 3d相机标定方法、装置及标定系统

Country Status (2)

Country Link
CN (1) CN113436277A (zh)
WO (1) WO2023284349A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117723551A (zh) * 2024-02-18 2024-03-19 宁德时代新能源科技股份有限公司 电池检测设备、点检方法、电池生产设备和检测方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113436277A (zh) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3d相机标定方法、装置及标定系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111476846A (zh) * 2020-04-01 2020-07-31 苏州苏映视图像软件科技有限公司 一种多3d相机标定系统及方法
CN112180362A (zh) * 2019-07-05 2021-01-05 北京地平线机器人技术研发有限公司 雷达与相机之间的转换位姿确定方法、装置以及电子设备
CN112308926A (zh) * 2020-10-16 2021-02-02 易思维(杭州)科技有限公司 一种无公共视场的相机外参标定方法
CN112612016A (zh) * 2020-12-07 2021-04-06 深兰人工智能(深圳)有限公司 传感器联合标定方法、装置、电子设备及存储介质
US20210118182A1 (en) * 2020-12-22 2021-04-22 Intel Corporation Methods and apparatus to perform multiple-camera calibration
CN112785656A (zh) * 2021-01-29 2021-05-11 北京罗克维尔斯科技有限公司 双立体相机的标定方法及装置、电子设备及存储介质
CN112802124A (zh) * 2021-01-29 2021-05-14 北京罗克维尔斯科技有限公司 多台立体相机的标定方法及装置、电子设备及存储介质
CN113077521A (zh) * 2021-03-19 2021-07-06 浙江华睿科技有限公司 一种相机标定方法及装置
CN113436277A (zh) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3d相机标定方法、装置及标定系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105678742B (zh) * 2015-12-29 2018-05-22 哈尔滨工业大学深圳研究生院 一种水下相机标定方法
CN109357633B (zh) * 2018-09-30 2022-09-30 先临三维科技股份有限公司 三维扫描方法、装置、存储介质和处理器
CN111047510B (zh) * 2019-12-17 2023-02-14 大连理工大学 一种基于标定的大视场角图像实时拼接方法
CN111965624B (zh) * 2020-08-06 2024-04-09 阿波罗智联(北京)科技有限公司 激光雷达与相机的标定方法、装置、设备和可读存储介质

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112180362A (zh) * 2019-07-05 2021-01-05 北京地平线机器人技术研发有限公司 雷达与相机之间的转换位姿确定方法、装置以及电子设备
CN111476846A (zh) * 2020-04-01 2020-07-31 苏州苏映视图像软件科技有限公司 一种多3d相机标定系统及方法
CN112308926A (zh) * 2020-10-16 2021-02-02 易思维(杭州)科技有限公司 一种无公共视场的相机外参标定方法
CN112612016A (zh) * 2020-12-07 2021-04-06 深兰人工智能(深圳)有限公司 传感器联合标定方法、装置、电子设备及存储介质
US20210118182A1 (en) * 2020-12-22 2021-04-22 Intel Corporation Methods and apparatus to perform multiple-camera calibration
CN112785656A (zh) * 2021-01-29 2021-05-11 北京罗克维尔斯科技有限公司 双立体相机的标定方法及装置、电子设备及存储介质
CN112802124A (zh) * 2021-01-29 2021-05-14 北京罗克维尔斯科技有限公司 多台立体相机的标定方法及装置、电子设备及存储介质
CN113077521A (zh) * 2021-03-19 2021-07-06 浙江华睿科技有限公司 一种相机标定方法及装置
CN113436277A (zh) * 2021-07-15 2021-09-24 无锡先导智能装备股份有限公司 3d相机标定方法、装置及标定系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SONG DAIPING;LU LU: "Non-cooperative Circle Characteristic Pose Measurement Using Multiple Cameras without Public Field of View", INFRARED TECHNOLOGY, vol. 42, no. 1, 21 January 2020 (2020-01-21), pages 93 - 98, XP093023409, ISSN: 1001-8891 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117723551A (zh) * 2024-02-18 2024-03-19 宁德时代新能源科技股份有限公司 电池检测设备、点检方法、电池生产设备和检测方法

Also Published As

Publication number Publication date
CN113436277A (zh) 2021-09-24

Similar Documents

Publication Publication Date Title
WO2023284349A1 (zh) 3d相机标定方法、装置及标定系统
US20220092819A1 (en) Method and system for calibrating extrinsic parameters between depth camera and visible light camera
US9866818B2 (en) Image processing apparatus and method, image processing system and program
Orghidan et al. Camera calibration using two or three vanishing points
WO2018201677A1 (zh) 基于光束平差的远心镜头三维成像系统的标定方法及装置
CN108109169B (zh) 一种基于矩形标识的位姿估计方法、装置及机器人
US20210104066A1 (en) Computer-implemented methods and system for localizing an object
KR20020035652A (ko) 3 차원적인 복구를 위한 스트랩다운 시스템
TW201622419A (zh) 相機校準
WO2023201578A1 (zh) 单目激光散斑投影系统的外参数标定方法和装置
Fathi et al. Multistep explicit stereo camera calibration approach to improve Euclidean accuracy of large-scale 3D reconstruction
CN111145271A (zh) 相机参数的精确度的确定方法、装置、存储介质及终端
CN114299156A (zh) 无重叠区域下多相机的标定与坐标统一方法
Song et al. Modeling deviations of rgb-d cameras for accurate depth map and color image registration
Briales et al. A minimal solution for the calibration of a 2D laser-rangefinder and a camera based on scene corners
CN111915681B (zh) 多组3d相机群的外参标定方法、装置、存储介质及设备
US10252417B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
CN113822920B (zh) 结构光相机获取深度信息的方法、电子设备及存储介质
Zhang et al. Camera self-calibration based on multiple view images
KR20200057929A (ko) 캘리브레이트된 카메라들에 의해 캡쳐된 스테레오 영상들의 렉티피케이션 방법과 컴퓨터 프로그램
CN114648544A (zh) 一种亚像素椭圆提取方法
Zhou et al. A new algorithm for computing the projection matrix between a LIDAR and a camera based on line correspondences
CN115018922A (zh) 畸变参数标定方法、电子设备和计算机可读存储介质
Cui et al. ACLC: Automatic Calibration for non-repetitive scanning LiDAR-Camera system based on point cloud noise optimization
CN113793379A (zh) 相机姿态求解方法及系统、设备和计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22840993

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE