WO2023201578A1 - External parameter calibration method and device for a monocular laser speckle projection system - Google Patents

External parameter calibration method and device for a monocular laser speckle projection system Download PDF

Info

Publication number
WO2023201578A1
WO2023201578A1 PCT/CN2022/087970 CN2022087970W WO2023201578A1 WO 2023201578 A1 WO2023201578 A1 WO 2023201578A1 CN 2022087970 W CN2022087970 W CN 2022087970W WO 2023201578 A1 WO2023201578 A1 WO 2023201578A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
speckle
laser speckle
image
projector
Prior art date
Application number
PCT/CN2022/087970
Other languages
English (en)
French (fr)
Inventor
张跃强
蒋卓灿
王骞鹏
Original Assignee
深圳大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳大学 filed Critical 深圳大学
Priority to PCT/CN2022/087970 priority Critical patent/WO2023201578A1/zh
Publication of WO2023201578A1 publication Critical patent/WO2023201578A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the present application relates to the field of computer vision, and in particular to an external parameter calibration method and device for a monocular laser speckle projection system.
  • High-precision depth measurement is one of the important research topics in the field of computer vision.
  • Traditional depth measurement methods mainly include Time-of-Flight (ToF for short) and binocular stereo vision methods.
  • ToF obtains the depth information of the target by measuring the flight time or phase shift of modulated light; it is only slightly affected by ambient light, measures quickly, and can achieve long-distance measurement.
  • however, the measurement accuracy of ToF only reaches the centimeter level, which cannot meet the needs of some high-precision measurement tasks.
  • the binocular stereo vision method obtains the disparity map of the target area by matching image pairs captured by two cameras at different positions, thereby obtaining the depth information of the target to be measured.
  • This method typically uses block matching or semi-global matching algorithms to search for similar regions of image pairs, enabling sub-pixel level matching accuracy. Since the binocular stereo vision method performs image matching based on visual features, it will be difficult to match scenes with obvious changes in ambient light or lack of texture features, resulting in large matching errors or even matching failures. In addition, the huge amount of calculation caused by the image feature extraction and matching process limits its application in the field of real-time measurement.
  • the laser in the laser speckle projector emits infrared laser light, which passes through a diffraction grating (frosted glass) to form a highly random speckle image.
  • the system uses an infrared camera to capture speckle images of the target surface to be measured, which greatly reduces the impact of ambient light on the measurement.
  • the measurement time of the laser speckle projection system can be shortened to a single exposure time, thereby achieving real-time dynamic measurement.
  • according to the number of cameras in the speckle projection system, such systems can be divided into two categories: binocular speckle systems and monocular speckle systems.
  • the binocular laser speckle projection system is equivalent to a binocular stereo vision system with speckle images.
  • the highly random speckle image endows the textureless area with rich feature information, which significantly improves the image matching accuracy and measurement accuracy of the binocular stereo vision system.
  • the manufacturing cost of the binocular speckle system is high, and the system calibration steps are complicated.
  • the monocular laser speckle projection system only contains an infrared camera and a laser speckle projector, making the system more compact and lower in cost.
  • because the laser speckle projector has no standard speckle image, and the speckle pattern deforms as the distance between the projector and the target increases, the monocular laser speckle projection system requires the use of high-precision ranging instruments to capture corresponding speckle images at different standard distances in advance, which makes the external parameter calibration of the monocular laser speckle projection system complex and cumbersome.
  • embodiments of the present application provide an external parameter calibration method and device for a monocular laser speckle projection system to solve the problem in the prior art that the external parameter calibration of a monocular laser speckle projection system is too complex and cumbersome.
  • embodiments of the present application provide a method for calibrating external parameters of a monocular laser speckle projection system.
  • the method includes:
  • a calibration plate is prepared to collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
  • an implementation is further provided in which the landmark features are diagonal marks, and the calculation of the spatial plane equation of the calibration plate includes:
  • a corresponding world coordinate system is established on the calibration plate, with the diagonal mark at the upper left as the origin of the coordinate system, the x-axis passing vertically downward through the diagonal mark at the lower left, the y-axis passing horizontally to the right through the diagonal mark at the upper right, and the z-axis perpendicular to the calibration plate and pointing outward, where the coordinates of the diagonal marks in the world coordinate system are expressed as (X w , Y w , 1), and the relationship between the pixel coordinates (u, v) of a diagonal mark and the three-dimensional coordinates (X w , Y w , 1) of the world coordinate system is expressed as:
  • K c is the internal parameter matrix of the camera
  • s is the scale coefficient
  • R is the rotation matrix
  • T is the translation vector
  • X c , Y c , Z c are the coordinates of the diagonal marks on the surface of the calibration plate in the camera coordinate system
  • the least squares method is used to fit the space plane equation of the calibration plate in the camera coordinate system.
  • the calculation of the three-dimensional coordinates of the speckle image point with the same name includes:
  • the equivalent three-dimensional coordinates (X c , Y c , Z c ) of the speckle image point in the camera coordinate system are calculated as:
  • (C x , C y ) are the principal point coordinates of the camera, d x and d y are the physical dimensions of the pixel in the x-axis and y-axis directions respectively, and f is the camera focal length;
  • the spatial straight line equation passing through the speckle image point and the origin of the camera coordinate system can be expressed as:
  • the three-dimensional coordinates of the speckle image points with the same name are calculated, and a three-dimensional coordinate set corresponding to the speckle image points with the same name is obtained.
  • Calculating the pose relationship between the camera and the laser speckle projector includes:
  • the normalized direction vector of the optical axis of the laser speckle projector in the camera coordinate system is V c = [v x , v y , v z ] T , which is expressed as V p = [0,0,1] T in the laser speckle projector coordinate system; the relationship between the two can be expressed as:
  • the coordinates of the optical center of the laser speckle projector in the camera coordinate system are (x p , y p , z p ), and the coordinates in the laser speckle projector coordinate system are (0,0,0) , the relationship between the two can be expressed as:
  • calculating the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship includes:
  • the coordinates of the speckle image point in the camera coordinate system satisfy the following equation:
  • n is the normalized normal vector of the calibration plate
  • d is the distance from the origin of the camera coordinate system to the calibration plate
  • Generating a virtual image of the laser speckle projector includes:
  • the camera image is mapped to the virtual projector plane to obtain the laser speckle projector virtual image.
  • an implementation is further provided in which, after the virtual image of the laser speckle projector is generated, the laser speckle projector is used as a second camera and, together with the camera in the monocular laser speckle projection system, forms a binocular camera system.
  • embodiments of the present application provide an external parameter calibration device for a monocular laser speckle projection system.
  • the device includes:
  • a camera calibration image acquisition module used to collect camera calibration images under a monocular laser speckle projection system, which includes a camera and a laser speckle projector;
  • a camera parameter calibration module used to calibrate camera parameters according to the camera calibration image
  • a speckle image acquisition module used to prepare a calibration plate and collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
  • a spatial plane equation calculation module configured to extract the landmark feature of the speckle image according to the camera parameters, and calculate the spatial plane equation of the calibration plate;
  • a same-name speckle image point acquisition module, used to acquire speckle image points with the same name from the speckle image;
  • a same-name speckle image point calculation module, used to calculate the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
  • An optical center and optical axis estimation module is used to estimate the optical center and optical axis position of the laser speckle projector based on the three-dimensional coordinates of the speckle image point with the same name;
  • a pose relationship calculation module configured to calculate the pose relationship between the camera and the laser speckle projector according to the preset coordinate system of the laser speckle projector
  • a virtual speckle image generation module is configured to calculate a plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship, and generate the laser speckle projector virtual image.
  • the present application provides a computer device, including a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor.
  • when the processor executes the computer-readable instructions, the steps of the external parameter calibration method of the first aspect are performed.
  • the present application provides a computer-readable storage medium that stores computer-readable instructions.
  • when the computer-readable instructions are executed by a processor, the steps of the method according to any one of the implementations of the first aspect are implemented.
  • this application uses a calibration plate to calculate the spatial plane equation, and calculates the three-dimensional coordinates of the speckle image points with the same name based on the spatial plane equation; it then estimates the optical center and optical axis position of the laser speckle projector based on the three-dimensional coordinates of the speckle image points with the same name; finally, according to the preset laser speckle projector coordinate system, it calculates the pose relationship between the camera and the laser speckle projector, from which the plane homography matrix between the camera image and the laser speckle projector virtual image can be generated, thereby realizing the external parameter calibration of the monocular laser speckle projection system.
  • This application does not require the use of precise ranging devices to capture corresponding speckle images at different standard distances, which can significantly improve the measurement efficiency of external parameter calibration and significantly improve the measurement accuracy.
  • Figure 1 is a flow chart of an external parameter calibration method of a monocular laser speckle projection system in an embodiment of the present application
  • Figure 2 is a schematic diagram of a calibration plate in an embodiment of the present application.
  • Figure 3 is a schematic diagram of the optical axis and optical center of a laser speckle projector in an embodiment of the present application.
  • although the terms first, second, third, etc. may be used to describe preset ranges and the like in the embodiments of the present application, these preset ranges should not be limited to these terms. These terms are only used to distinguish preset ranges from each other.
  • the first preset range may also be called the second preset range, and similarly, the second preset range may also be called the first preset range.
  • depending on the context, the word "if" as used herein may be interpreted as "at the time of", "when", "in response to determining", or "in response to detecting".
  • similarly, depending on the context, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)".
  • Figure 1 is a flow chart of an external parameter calibration method of a monocular laser speckle projection system in an embodiment of the present application. As shown in Figure 1, the external parameter calibration method of the monocular laser speckle projection system specifically includes the following steps:
  • S10 Collect camera calibration images under a monocular laser speckle projection system, which includes a camera and a laser speckle projector.
  • an infrared camera and a laser speckle projector are fixed on a tripod at an appropriate angle to form a monocular laser speckle projection system. After that, select a checkerboard calibration plate of appropriate size and place it within the camera's field of view, open the camera, adjust the focus, and capture the image of the checkerboard calibration plate. During this period, the position and posture of the checkerboard calibration plate need to be continuously adjusted.
  • the Zhang Zhengyou calibration method is used to calculate the intrinsic parameters of the camera, including focal length and principal point coordinates, as well as radial distortion and tangential distortion.
  • a calibration plate is produced, and speckle images are collected under a monocular laser speckle projection system.
  • the calibration plate is provided with at least three landmark features.
  • the calibration plate includes at least three landmark features, which are arranged on the plane of the calibration plate in the form of marks or specific features. The present application does not limit the form of the landmark features.
  • diagonal marks can be used to provide mark features on the calibration plate that can be used to calculate the spatial plane equation of the calibration plate.
  • the number of landmark features on the calibration plate can be three or more, and the more landmark features are provided (which facilitates calculation and verification), the more the accuracy of the external parameter calibration of the monocular laser speckle projection system is improved, and hence the measurement accuracy of the monocular laser speckle projection system.
  • a sheet of calibration paper with a diagonal mark at each of its four corners and blank elsewhere is printed, and the diagonal-mark calibration paper is attached flat to a plate to obtain a calibration plate with diagonal marks.
  • the marking feature in the calibration plate may be a diagonal mark, that is, a mark placed at a diagonal position of the calibration plate.
  • the distortion of the speckle image can be corrected based on the camera distortion coefficient (including radial distortion and tangential distortion) obtained in S20.
  • the Harris corner point detection algorithm is used to identify the diagonal marks in the speckle image and extract the diagonal marks, so as to calculate the spatial plane equation of the calibration plate based on the diagonal marks.
  • FIG. 2 is a schematic diagram of a calibration plate in an embodiment of the present application. As can be seen from Figure 2, there is a diagonal mark at each of the four corners of the calibration plate, and the world coordinate system established from these diagonal marks is shown by the coordinate axes in Figure 2.
  • since the physical distances between the diagonal marks are known, their coordinates in the world coordinate system are known; the coordinates of a diagonal mark in the world coordinate system are expressed as (X w , Y w , 1), and the relationship between the pixel coordinates (u, v) of the diagonal mark and the three-dimensional coordinates (X w , Y w , 1) of the world coordinate system is expressed as:
  • K c is the internal parameter matrix of the camera
  • s is the scale coefficient
  • R is the rotation matrix
  • T is the translation vector
  • R and T describe the pose relationship between the camera coordinate system and the world coordinate system; four pairs of diagonal-mark coordinate correspondences are sufficient to solve for R and T, and the coordinates (X c , Y c , Z c ) of the diagonal marks on the surface of the calibration plate in the camera coordinate system can then also be obtained:
  • the first speckle image can be used as the reference image for speckle matching, and a digital image correlation method is used to determine the best matching positions of the speckle image points by solving a displacement shape function containing first-order and second-order displacement gradient parameters, thereby obtaining the set of speckle image points with the same name in the speckle images.
  • the light entering the camera must pass through the optical center of the camera, which is the origin of the camera coordinate system.
  • after the spatial plane equation of the calibration plate is obtained, the three-dimensional coordinates of the speckle image points with the same name can be calculated from it. It can be understood that a speckle image point with the same name corresponds to a three-dimensional point on the calibration plate plane, namely the intersection of the speckle ray emitted by the laser speckle projector with the calibration plate plane.
  • the equivalent three-dimensional coordinates (X c , Y c , Z c ) of the speckle image point in the camera coordinate system are calculated as:
  • (C x , C y ) are the principal point coordinates of the camera, d x and d y are the physical dimensions of a pixel in the x-axis and y-axis directions respectively, and f is the camera focal length; the equation of the spatial straight line passing through the speckle image point and the origin of the camera coordinate system can be expressed as:
  • the three-dimensional coordinates of the speckle image point with the same name are calculated, and the three-dimensional coordinate set corresponding to the speckle image point with the same name is obtained.
  • S70 Estimate the optical center and optical axis position of the laser speckle projector based on the three-dimensional coordinates of the speckle image point with the same name.
  • the optical axis of the laser speckle projector is designed to be strictly perpendicular to the diffraction grating and pass through its center position. Therefore, the center points of the speckle images on the calibration plates at different positions and postures are all located on the optical axis of the laser speckle projector or in the vicinity. According to the three-dimensional coordinate set corresponding to the center point of the speckle image obtained in step S60, the optical axis position of the laser speckle projector can be determined through straight line fitting.
  • the straight lines fitted from the speckle points with the same name correspond to the rays emitted by the laser speckle projector, and each of these rays passes through the light source point of the laser, i.e., the optical center of the laser speckle projector. Understandably, under the influence of factors such as camera calibration error, image matching error, and fitting error, the set of fitted straight lines will not intersect at a single point but will deviate to varying degrees. Therefore, the spatial point closest to the set of fitted straight lines is calculated and used as the optimal projector optical center.
  • Figure 3 is a schematic diagram of the optical axis and optical center of a laser speckle projector in an embodiment of the present application. From Figure 3, we can see the physical spatial relationship between the speckle spots of the same name on the calibration board, the optical axis and the (laser speckle) projector.
  • S80 Calculate the pose relationship between the camera and the laser speckle projector based on the preset coordinate system of the laser speckle projector.
  • a laser speckle projector coordinate system can be established based on the obtained optical center and optical axis position of the laser speckle projector, and the distance between the camera and the laser speckle projector can be calculated in the laser speckle projector coordinate system. posture relationship.
  • the coordinate system of the laser speckle projector takes the optical center of the laser speckle projector as the origin, the z-axis coincides with the optical axis of the laser speckle projector, and the direction facing the target is the positive direction.
  • A x , A y , and A z are the Euler angles of the rotation matrix. It should be noted that the Euler angles may use the x-y-z rotation sequence; Euler angles with other rotation sequences can also be used.
  • A x and A y can then be calculated:
  • A z determines the directions of the x and y axes of the laser speckle projector coordinate system, as well as the speckle coordinates in the speckle image. Since the optical center and optical axis position of the laser speckle projector have been determined, the absolute physical position of the speckle pattern with respect to the laser speckle projector is fixed and independent of the Euler angles. Under the condition that most or all of the speckle image falls within the virtual image of the laser speckle projector, an appropriate A z value is selected and the rotation matrix R is calculated.
  • the coordinates of the optical center of the laser speckle projector in the camera coordinate system are (x p , y p , z p ), and the coordinates in the laser speckle projector coordinate system are (0,0,0).
  • the relationship can be expressed as:
  • S90 Calculate the plane homography matrix between the camera image and the laser speckle projector virtual image based on the pose relationship, and generate the laser speckle projector virtual image.
  • the coordinates of the speckle image point in the camera coordinate system are (x 1 , y 1 , z 1 ), and the coordinates of the laser speckle projector coordinate system are (x 2 , y 2 , z 2 ).
  • the relationship between can be expressed as:
  • R and T are the rotation matrix and translation vector calculated by S80 respectively. They describe the pose relationship between the camera coordinate system and the laser speckle projector coordinate system. Since the speckle image points are located on the surface of the calibration plate, the coordinates of the speckle image points in the camera coordinate system satisfy the following equation:
  • n is the normalized normal vector of the calibration plate
  • d is the distance from the origin of the camera coordinate system to the calibration plate
  • a calibration plate is placed on the target to be measured, the laser speckle projector projects the speckle image onto the surface of the calibration plate, and the camera captures the corresponding speckle image.
  • the plane homography matrix H can be obtained. Understandably, with the plane homography matrix H, the image captured by the camera can be converted to the laser speckle projector view, so that the monocular laser speckle projection system of the present application has binocular stereoscopic vision capability equivalent to that of a binocular camera system, and the calibration of external parameters can be completed conveniently and efficiently.
  • in step S90, that is, the step of generating the virtual image of the laser speckle projector, the method specifically includes: mapping the camera image to the virtual projector plane according to the plane homography matrix between the camera image and the laser speckle projector virtual image, to obtain the laser speckle projector virtual image.
  • the plane homography matrix describes the mapping relationship between points on the same plane between different images.
  • the plane may be the plane where the calibration plate is located or the plane where the corresponding image of the target to be measured is located. It can be understood that when a binocular camera is used for shooting, the two cameras will obtain images from different shooting angles on the plane, and the plane homography matrix describes the mapping relationship between the images from different shooting angles.
  • this application establishes the connection between the camera and the laser speckle projector with the help of a spatial plane, and restores the virtual speckle image corresponding to the laser speckle projector through a planar homography matrix.
  • the monocular laser speckle projection system in this application is equivalent to a binocular camera system, and binocular stereo vision techniques can be applied to the camera and the laser speckle projector (whose recovered virtual speckle image plays the role of a second camera) for stereo rectification, online calibration, and other operations.
  • the traditional monocular laser speckle projection system needs an accurate ranging device to capture corresponding speckle images at different standard distances in order to complete the external parameter calibration, whereas the monocular laser speckle projection system in this application can calculate, through a calibration plate with landmark features, the plane homography matrix describing the image conversion relationship between the camera and the laser speckle projector, so that a virtual speckle image can be recovered for the laser speckle projector, giving the monocular laser speckle projection system in this application a capability equivalent to binocular stereo vision.
  • compared with the traditional monocular laser speckle projection system, this application implements the external parameter calibration of the monocular laser speckle projection system with a calibration plate, without the need to use an accurate ranging device to capture corresponding speckle images at different standard distances. This significantly improves measurement efficiency and reduces measurement costs.
  • the monocular laser speckle projection system in this application is equivalent to a binocular camera system with speckle images, which improves measurement accuracy.
  • this application can also calibrate the optical center and optical axis position of the laser speckle projector, allowing users to correct online the optical axis deviation that occurs during use of the monocular laser speckle projection system.
  • a three-dimensional calibration piece composed of multiple planes can also be used to calibrate the external parameters of the monocular laser speckle projection system.
  • the spatial plane equation of each plane can be calculated from the landmark features on that plane; then, by changing the position of the planes, the speckle image points with the same name can be found and their three-dimensional coordinates calculated, so as to estimate the optical center and optical axis position of the laser speckle projector, determine the homography matrices between the camera and the laser speckle projector for the different planes, and finally generate the laser speckle projector virtual image from these homography matrices.
  • a monocular laser speckle projection system that uses a three-dimensional calibration piece composed of multiple planes for external parameter calibration has binocular stereoscopic vision capability equivalent to that of a binocular camera system and can complete the calibration of external parameters conveniently and efficiently. It should be understood that other calibration methods implemented with multiple planes should also fall within the protection scope of this application.
  • sequence number of each step in the above embodiment does not mean the order of execution.
  • the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
  • An embodiment of the present application provides an external parameter calibration device for a monocular laser speckle projection system.
  • the device includes:
  • the camera calibration image acquisition module is used to collect camera calibration images under a monocular laser speckle projection system.
  • the monocular laser speckle projection system includes a camera and a laser speckle projector;
  • Camera parameter calibration module used to calibrate camera parameters based on camera calibration images
  • a speckle image acquisition module is used to prepare a calibration plate and collect speckle images under a monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
  • the spatial plane equation calculation module is used to extract the signature features of the speckle image according to the camera parameters and calculate the spatial plane equation of the calibration plate;
  • the speckle image point acquisition module with the same name is used to acquire the speckle image points with the same name based on the speckle image;
  • the speckle image point calculation module with the same name is used to calculate the three-dimensional coordinates of the speckle image point with the same name based on the spatial plane equation of the calibration plate;
  • the optical center and optical axis estimation module is used to estimate the optical center and optical axis position of the laser speckle projector based on the three-dimensional coordinates of the speckle image point with the same name;
  • the pose relationship calculation module is used to calculate the pose relationship between the camera and the laser speckle projector based on the preset laser speckle projector coordinate system;
  • the virtual speckle image generation module is used to calculate the plane homography matrix between the camera image and the laser speckle projector virtual image based on the pose relationship, and generate the laser speckle projector virtual image.
  • compared with the traditional monocular laser speckle projection system, this application implements the external parameter calibration of the monocular laser speckle projection system with a calibration plate, without the need to use an accurate ranging device to capture corresponding speckle images at different standard distances. This significantly improves measurement efficiency and reduces measurement costs.
  • the monocular laser speckle projection system in this application is equivalent to a binocular camera system with speckle images, which improves measurement accuracy.
  • this application can also calibrate the optical center and optical axis position of the laser speckle projector, allowing users to correct online the optical axis deviation that occurs during use of the monocular laser speckle projection system.
  • the present application provides a computer device, including a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor.
  • the processor executes the computer-readable instructions, it performs the following steps: The steps of the external parameter calibration method of the monocular laser speckle projection system described in the example.
  • the present application provides a computer-readable storage medium.
  • the computer-readable storage medium stores computer-readable instructions.
  • when the computer-readable instructions are executed by a processor, the steps of the external parameter calibration method of the monocular laser speckle projection system described in the embodiments are implemented.
  • those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the above functional units and modules is only used as an example; in practical applications, the above functions can be assigned to different functional units or modules as needed, that is, the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application discloses an external parameter calibration method and device for a monocular laser speckle projection system. The external parameter calibration method uses a calibration plate to compute a spatial plane equation, and computes the three-dimensional coordinates of speckle image points with the same name according to the spatial plane equation; it then estimates the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name; finally, according to a preset laser speckle projector coordinate system, it calculates the pose relationship between the camera and the laser speckle projector, and this pose relationship can be used to generate the plane homography matrix between the camera image and the laser speckle projector virtual image, thereby realizing the external parameter calibration of the monocular laser speckle projection system. The method can significantly improve the measurement efficiency of the external parameter calibration and can significantly improve the measurement accuracy.

Description

External parameter calibration method and device for a monocular laser speckle projection system
Technical Field
The present application relates to the field of computer vision, and in particular to an external parameter calibration method and device for a monocular laser speckle projection system.
Background Art
High-precision depth measurement is one of the important research topics in the field of computer vision. Traditional depth measurement methods mainly include Time-of-Flight (ToF) and binocular stereo vision methods. ToF obtains the depth information of the target by measuring the flight time or phase shift of modulated light; it is only slightly affected by ambient light, measures quickly, and can achieve long-distance measurement. However, the measurement accuracy of ToF only reaches the centimeter level, which cannot meet the needs of some high-precision measurement tasks.
The binocular stereo vision method obtains the disparity map of the target area by matching image pairs captured by two cameras at different positions, and thereby obtains the depth information of the target to be measured. This method typically uses block matching or semi-global matching algorithms to search for similar regions in the image pairs and can achieve sub-pixel matching accuracy. Since the binocular stereo vision method matches images based on visual features, matching becomes difficult in scenes with obvious changes in ambient light or a lack of texture features, resulting in large matching errors or even matching failure. In addition, the huge amount of computation required by image feature extraction and matching limits its application in real-time measurement.
To overcome the above drawbacks of ToF and binocular stereo vision methods, researchers have developed projection systems based on laser speckle images. The laser in the laser speckle projector emits infrared laser light, which passes through a diffraction grating (frosted glass) to form a highly random speckle pattern. The system uses an infrared camera to capture speckle images of the surface of the target to be measured, which greatly reduces the influence of ambient light on the measurement. At the same time, owing to the randomness of the speckle pattern, the measurement time of a laser speckle projection system can be shortened to a single exposure, enabling real-time dynamic measurement.
According to the number of cameras in the speckle projection system, such systems can be divided into two categories: binocular speckle systems and monocular speckle systems. A binocular laser speckle projection system is equivalent to a binocular stereo vision system with speckle images. The highly random speckle pattern endows textureless regions with rich feature information, which significantly improves the image matching accuracy and measurement accuracy of the binocular stereo vision system. However, the manufacturing cost of a binocular speckle system is high, and the system calibration steps are complicated. A monocular laser speckle projection system contains only one infrared camera and one laser speckle projector, making the system more compact and lower in cost.
Because the laser speckle projector has no standard speckle image, and because the speckle pattern deforms to varying degrees as the distance between the projector and the target increases, a monocular laser speckle projection system must, before leaving the factory or after being returned for repair, use high-precision ranging instruments to capture corresponding speckle images at different standard distances in advance, which makes the external parameter calibration of the monocular laser speckle projection system complex and cumbersome.
Summary of the Invention
In view of this, embodiments of the present application provide an external parameter calibration method and device for a monocular laser speckle projection system, to solve the problem in the prior art that the external parameter calibration of a monocular laser speckle projection system is overly complex and cumbersome.
In a first aspect, embodiments of the present application provide an external parameter calibration method for a monocular laser speckle projection system, the method including:
collecting camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system including a camera and a laser speckle projector;
calibrating camera parameters according to the camera calibration images;
preparing a calibration plate and collecting speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
extracting the landmark features of the speckle images according to the camera parameters, and calculating the spatial plane equation of the calibration plate;
acquiring speckle image points with the same name according to the speckle images;
calculating the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
estimating the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name;
calculating the pose relationship between the camera and the laser speckle projector according to a preset laser speckle projector coordinate system;
calculating a plane homography matrix between the camera image and a laser speckle projector virtual image according to the pose relationship, and generating the laser speckle projector virtual image.
In the above aspect and any possible implementation, an implementation is further provided in which the landmark features are diagonal marks, and calculating the spatial plane equation of the calibration plate includes:
establishing a corresponding world coordinate system on the calibration plate, with the diagonal mark at the upper left as the origin of the coordinate system, the x-axis passing vertically downward through the diagonal mark at the lower left, the y-axis passing horizontally to the right through the diagonal mark at the upper right, and the z-axis perpendicular to the calibration plate and pointing outward, wherein the coordinates of the diagonal marks in the world coordinate system are expressed as (X w, Y w, 1), and the relationship between the pixel coordinates (u, v) of a diagonal mark and its three-dimensional world coordinates (X w, Y w, 1) is expressed as:
Figure PCTCN2022087970-appb-000001
where K c is the internal parameter matrix of the camera, s is a scale coefficient, R is the rotation matrix, and T is the translation vector; the coordinates (X c, Y c, Z c) of the diagonal marks on the surface of the calibration plate in the camera coordinate system can then be obtained:
Figure PCTCN2022087970-appb-000002
fitting, by the least squares method, the spatial plane equation of the calibration plate in the camera coordinate system.
In the above aspect and any possible implementation, an implementation is further provided in which calculating the three-dimensional coordinates of the speckle image points with the same name includes:
calculating, from the speckle image point coordinates (u, v) and the camera parameters, the equivalent three-dimensional coordinates (X c, Y c, Z c) of the speckle image point in the camera coordinate system as:
Figure PCTCN2022087970-appb-000003
where (C x, C y) are the principal point coordinates of the camera, d x and d y are the physical dimensions of a pixel in the x-axis and y-axis directions respectively, and f is the camera focal length;
the equation of the spatial straight line passing through the speckle image point and the origin of the camera coordinate system can be expressed as:
Figure PCTCN2022087970-appb-000004
calculating the three-dimensional coordinates of the speckle image points with the same name according to the spatial straight line equation and the spatial plane equation of the calibration plate, to obtain the set of three-dimensional coordinates corresponding to the speckle image points with the same name.
In the above aspect and any possible implementation, an implementation is further provided in which calculating the pose relationship between the camera and the laser speckle projector includes:
wherein the normalized direction vector of the optical axis of the laser speckle projector in the camera coordinate system is V c=[v x, v y, v z] T, which is expressed as V p=[0,0,1] T in the laser speckle projector coordinate system; the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000005
calculating A x and A y:
Figure PCTCN2022087970-appb-000006
selecting a preset A z value and calculating the rotation matrix R, where A x, A y, and A z are the Euler angles of the rotation matrix R;
the coordinates of the optical center of the laser speckle projector in the camera coordinate system are (x p, y p, z p), and its coordinates in the laser speckle projector coordinate system are (0,0,0); the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000007
obtaining the translation vector T:
Figure PCTCN2022087970-appb-000008
In the above aspect and any possible implementation, an implementation is further provided in which calculating the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship includes:
assuming the coordinates of a speckle image point in the camera coordinate system are (x 1, y 1, z 1) and its coordinates in the laser speckle projector coordinate system are (x 2, y 2, z 2), the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000009
since the speckle image points are located on the surface of the calibration plate, the coordinates of the speckle image points in the camera coordinate system satisfy the following equation:
Figure PCTCN2022087970-appb-000010
where n is the normalized normal vector of the calibration plate, and d is the distance from the origin of the camera coordinate system to the calibration plate;
combining the above two equations yields:
Figure PCTCN2022087970-appb-000011
the plane homography matrix between the camera image and the projector virtual image is then expressed as:
Figure PCTCN2022087970-appb-000012
where K c and K p are the internal parameter matrices of the camera and the laser speckle projector, respectively, and K p = K c.
In the above aspect and any possible implementation, an implementation is further provided in which generating the laser speckle projector virtual image includes:
mapping the camera image to the virtual projector plane according to the plane homography matrix between the camera image and the laser speckle projector virtual image, to obtain the laser speckle projector virtual image.
In the above aspect and any possible implementation, an implementation is further provided in which, after the laser speckle projector virtual image is generated, the laser speckle projector is used as a second camera and, together with the camera in the monocular laser speckle projection system, forms a binocular camera system.
In a second aspect, embodiments of the present application provide an external parameter calibration device for a monocular laser speckle projection system, the device including:
a camera calibration image acquisition module, configured to collect camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system including a camera and a laser speckle projector;
a camera parameter calibration module, configured to calibrate camera parameters according to the camera calibration images;
a speckle image acquisition module, configured to prepare a calibration plate and collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
a spatial plane equation calculation module, configured to extract the landmark features of the speckle images according to the camera parameters and calculate the spatial plane equation of the calibration plate;
a same-name speckle image point acquisition module, configured to acquire speckle image points with the same name according to the speckle images;
a same-name speckle image point calculation module, configured to calculate the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
an optical center and optical axis estimation module, configured to estimate the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name;
a pose relationship calculation module, configured to calculate the pose relationship between the camera and the laser speckle projector according to a preset laser speckle projector coordinate system;
a virtual speckle image generation module, configured to calculate the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship, and generate the laser speckle projector virtual image.
In a third aspect, the present application provides a computer device, including a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor, where the processor, when executing the computer-readable instructions, performs the steps of the external parameter calibration method for a monocular laser speckle projection system described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium storing computer-readable instructions that, when executed by a processor, implement the steps of the external parameter calibration method for a monocular laser speckle projection system described in any one of the implementations of the first aspect.
In the embodiments of the present application, a calibration plate is used to calculate the spatial plane equation, and the three-dimensional coordinates of the speckle image points with the same name are calculated according to the spatial plane equation; the optical center and optical axis position of the laser speckle projector are then estimated according to the three-dimensional coordinates of the speckle image points with the same name; finally, the pose relationship between the camera and the laser speckle projector is calculated according to the preset laser speckle projector coordinate system, and this pose relationship can be used to generate the plane homography matrix between the camera image and the laser speckle projector virtual image, thereby realizing the external parameter calibration of the monocular laser speckle projection system. The present application does not need to use a precise ranging device to capture corresponding speckle images at different standard distances, which can significantly improve the measurement efficiency of the external parameter calibration and significantly improve the measurement accuracy.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application more clearly, the drawings required in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Figure 1 is a flowchart of an external parameter calibration method for a monocular laser speckle projection system in an embodiment of the present application;
Figure 2 is a schematic diagram of a calibration plate in an embodiment of the present application;
Figure 3 is a schematic diagram of the optical axis and optical center of a laser speckle projector in an embodiment of the present application.
Detailed Description of the Embodiments
For a better understanding of the technical solutions of the present application, the embodiments of the present application are described in detail below with reference to the drawings.
It should be clear that the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
The terms used in the embodiments of the present application are only for the purpose of describing specific embodiments and are not intended to limit the present application. The singular forms "a", "said", and "the" used in the embodiments of the present application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" used herein merely describes an association between related objects, indicating that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present application to describe preset ranges and the like, these preset ranges should not be limited to these terms. These terms are only used to distinguish the preset ranges from one another. For example, without departing from the scope of the embodiments of the present application, a first preset range may also be called a second preset range, and similarly, a second preset range may also be called a first preset range.
Depending on the context, the word "if" as used herein may be interpreted as "at the time of", "when", "in response to determining", or "in response to detecting". Similarly, depending on the context, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)".
Figure 1 is a flowchart of an external parameter calibration method for a monocular laser speckle projection system in an embodiment of the present application. As shown in Figure 1, the external parameter calibration method of the monocular laser speckle projection system specifically includes the following steps:
S10: Collect camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system including a camera and a laser speckle projector.
In one embodiment, an infrared camera and a laser speckle projector are fixed on a tripod at an appropriate angle to form a monocular laser speckle projection system. Then, a checkerboard calibration plate of appropriate size is selected and placed within the camera's field of view; the camera is turned on, the focus is adjusted, and images of the checkerboard calibration plate are captured. During this process, the position and posture of the checkerboard calibration plate need to be adjusted continuously.
S20: Calibrate camera parameters according to the camera calibration images.
In one embodiment, based on the set of checkerboard calibration plate images captured by the camera, Zhang Zhengyou's calibration method is used to calculate the intrinsic parameters of the camera, including the focal length and principal point coordinates, as well as the radial and tangential distortion.
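For illustration, the following is a minimal Python sketch of this intrinsic calibration step, relying on OpenCV's implementation of Zhang's method. The image folder, board size and square size are hypothetical placeholders, not values from this application.

```python
# Minimal sketch of S20: intrinsic calibration via OpenCV (Zhang's method).
# "calib/*.png", the 9x6 board and the 20 mm square size are assumptions.
import glob
import cv2
import numpy as np

board_size = (9, 6)          # inner corners per row/column (assumed)
square_size = 20.0           # mm (assumed)

# World coordinates of the checkerboard corners (plane Z = 0)
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# K_c holds focal length and principal point; dist holds radial/tangential terms
rms, K_c, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
```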
S30: Prepare a calibration plate and collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features.
The calibration plate includes at least three landmark features, which are arranged on the plane of the calibration plate in the form of marks or specific features. The present application does not limit the form of the landmark features.
Specifically, diagonal marks can be used to provide landmark features on the calibration plate that can be used to calculate the spatial plane equation of the calibration plate. It can be understood that three or more landmark features can be provided on the calibration plate, and the more there are (which facilitates calculation and verification), the more the accuracy of the external parameter calibration of the monocular laser speckle projection system is improved, and hence the measurement accuracy of the monocular laser speckle projection system.
In one embodiment, a sheet of calibration paper with a diagonal mark at each of its four corners and blank elsewhere is printed, and the diagonal-mark calibration paper is attached flat to a plate to obtain a calibration plate with diagonal marks. The calibration plate is placed within the field of view of the monocular laser speckle system, the laser speckle projector is turned on to project the speckle pattern onto the surface of the calibration plate, and the camera records the corresponding speckle images. During this process, the position and posture of the calibration plate need to be adjusted continuously.
S40: Extract the landmark features of the speckle images according to the camera parameters, and calculate the spatial plane equation of the calibration plate.
In one embodiment, the landmark features on the calibration plate may specifically be diagonal marks, that is, marks placed at diagonal positions of the calibration plate.
In one embodiment, the distortion of the speckle images can be corrected according to the camera distortion coefficients (including radial and tangential distortion) obtained in S20. The Harris corner detection algorithm is used to identify and extract the diagonal marks in the speckle image, so that the spatial plane equation of the calibration plate can be calculated from the diagonal marks.
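A minimal sketch of the undistortion and Harris-based mark extraction described above, assuming the K_c and dist obtained in S20; the Harris parameters and the quadrant-based selection of the marks are illustrative assumptions.

```python
# Minimal sketch of the first half of S40: undistort a speckle image and find
# corner-like candidates that should include the diagonal marks.
import cv2
import numpy as np

speckle = cv2.imread("speckle_000.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
undist = cv2.undistort(speckle, K_c, dist)

# Harris corner response on the undistorted image
response = cv2.cornerHarris(np.float32(undist), blockSize=5, ksize=3, k=0.04)
candidates = np.argwhere(response > 0.01 * response.max())      # (row, col) pairs

# In practice the four diagonal marks would be selected from the candidates,
# e.g. by taking the strongest response in each image quadrant (assumption).
```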
Further, a corresponding world coordinate system can be established on the calibration plate, with the diagonal mark at the upper left as the origin, the x-axis passing vertically downward through the diagonal mark at the lower left, the y-axis passing horizontally to the right through the diagonal mark at the upper right, and the z-axis perpendicular to the calibration plate and pointing outward. Figure 2 is a schematic diagram of a calibration plate in an embodiment of the present application; as can be seen from Figure 2, there is a diagonal mark at each of the four corners of the calibration plate, and the world coordinate system established from these diagonal marks is shown by the coordinate axes in Figure 2.
It can be understood that, since the physical distances between the diagonal marks are known, their coordinates in the world coordinate system are known; the coordinates of a diagonal mark in the world coordinate system are expressed as (X w, Y w, 1), and the relationship between the pixel coordinates (u, v) of the diagonal mark and its three-dimensional world coordinates (X w, Y w, 1) is expressed as:
Figure PCTCN2022087970-appb-000013
where K c is the internal parameter matrix of the camera, s is a scale coefficient, R is the rotation matrix, and T is the translation vector; R and T describe the pose relationship between the camera coordinate system and the world coordinate system, and four pairs of diagonal-mark coordinate correspondences are sufficient to solve for R and T, so that the coordinates (X c, Y c, Z c) of the diagonal marks on the surface of the calibration plate in the camera coordinate system can also be obtained:
Figure PCTCN2022087970-appb-000014
Finally, the spatial plane equation of the calibration plate in the camera coordinate system is obtained by least squares fitting.
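A minimal sketch of this part of S40, recovering R and T from the four diagonal marks and fitting the plate plane in camera coordinates. The mark spacing, the ordering of marks_uv (assumed to come from the extraction step above) and the use of cv2.solvePnP are assumptions consistent with, but not prescribed by, the description.

```python
# Minimal sketch of the second half of S40: plate pose from the marks and a
# least-squares plane fit n . X = d in camera coordinates.
import cv2
import numpy as np

w = 200.0  # assumed physical distance between diagonal marks, in mm
world_pts = np.array([[0, 0, 0],     # upper-left mark (origin)
                      [w, 0, 0],     # lower-left mark (+x points down)
                      [0, w, 0],     # upper-right mark (+y points right)
                      [w, w, 0]], dtype=np.float64)
pixel_pts = np.array(marks_uv, dtype=np.float64)   # 4x2, from corner extraction

ok, rvec, tvec = cv2.solvePnP(world_pts, pixel_pts, K_c, dist)
R_wc, _ = cv2.Rodrigues(rvec)
marks_cam = (R_wc @ world_pts.T + tvec).T           # marks in camera coordinates

# Least-squares plane fit: n is the unit normal of the plate, d its offset
centroid = marks_cam.mean(axis=0)
_, _, vt = np.linalg.svd(marks_cam - centroid)
n = vt[-1] / np.linalg.norm(vt[-1])
d = float(n @ centroid)                             # plane: n . X = d
```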
S50: Acquire speckle image points with the same name from the speckle images.
In one embodiment, the first speckle image can be used as the reference image for speckle matching, and a digital image correlation method is used to determine the best matching positions of the speckle image points by solving a displacement shape function containing first-order and second-order displacement gradient parameters, thereby obtaining the set of speckle image points with the same name (corresponding points) across the speckle images.
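The description uses digital image correlation with first- and second-order displacement shape functions; the sketch below only illustrates the simpler zero-order (pure translation) case with normalized cross-correlation, using an assumed subset size and search radius.

```python
# Simplified sketch of S50: zero-order speckle matching by normalized
# cross-correlation (the full DIC shape-function solve is not shown).
import cv2
import numpy as np

def match_point(ref_img, cur_img, pt, subset=31, search=60):
    """Integer-pixel match of `pt` (x, y) from ref_img in cur_img."""
    h = subset // 2
    x, y = int(pt[0]), int(pt[1])
    template = ref_img[y - h:y + h + 1, x - h:x + h + 1]
    x0, y0 = max(x - search, h), max(y - search, h)
    window = cur_img[y0 - h:y + search + h + 1, x0 - h:x + search + h + 1]
    score = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (mx, my) = cv2.minMaxLoc(score)
    return (x0 + mx, y0 + my)          # matched subset center in cur_img
```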
S60: Calculate the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate.
It can be understood that, in a distortion-free camera imaging model, the light entering the camera necessarily passes through the optical center of the camera, i.e., the origin of the camera coordinate system; once the spatial plane equation of the calibration plate has been obtained, the three-dimensional coordinates of the speckle image points with the same name can be calculated from it. It can be understood that a speckle image point with the same name corresponds to a three-dimensional point on the calibration plate plane, namely the intersection of the speckle ray emitted by the laser speckle projector with the calibration plate plane.
Specifically, from the speckle image point coordinates (u, v) and the camera parameters, the equivalent three-dimensional coordinates (X c, Y c, Z c) of the speckle image point in the camera coordinate system are calculated as:
Figure PCTCN2022087970-appb-000015
where (C x, C y) are the principal point coordinates of the camera, d x and d y are the physical dimensions of a pixel in the x-axis and y-axis directions respectively, and f is the camera focal length; the equation of the spatial straight line passing through the speckle image point and the origin of the camera coordinate system can be expressed as:
Figure PCTCN2022087970-appb-000016
The three-dimensional coordinates of the speckle image points with the same name are calculated from the spatial straight line equation and the spatial plane equation of the calibration plate, yielding the set of three-dimensional coordinates corresponding to the speckle image points with the same name.
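A minimal sketch of S60: back-projecting a speckle image point into a camera ray and intersecting that ray with the fitted plate plane n . X = d. The pixel pitch and focal length values in the usage comment are hypothetical.

```python
# Minimal sketch of S60: ray-plane intersection in camera coordinates.
import numpy as np

def speckle_point_3d(u, v, cx, cy, dx, dy, f, n, d):
    """Intersection of the ray through pixel (u, v) with the plane n . X = d."""
    # Equivalent 3-D point of the pixel on the plane Z_c = f (camera coordinates)
    ray = np.array([(u - cx) * dx, (v - cy) * dy, f], dtype=np.float64)
    t = d / float(n @ ray)      # scale so the scaled ray lies on the plate plane
    return t * ray              # (X_c, Y_c, Z_c) on the calibration plate

# Example with hypothetical numbers:
# n, d from the plane fit above; cx, cy from K_c; dx = dy = 0.003 mm; f = 4.0 mm
# P = speckle_point_3d(512.3, 388.7, cx, cy, 0.003, 0.003, 4.0, n, d)
```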
S70: Estimate the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name.
In one embodiment, during manufacture of the laser speckle projector, its optical axis is designed to be strictly perpendicular to the diffraction grating and to pass through its center. Therefore, the center points of the speckle patterns on calibration plates at different positions and postures all lie on or near the optical axis of the laser speckle projector. From the set of three-dimensional coordinates corresponding to the center points of the speckle images obtained in step S60, the optical axis position of the laser speckle projector can be determined by straight-line fitting. In addition, the straight lines fitted from the speckle points with the same name correspond to the rays emitted by the laser speckle projector, and each of these rays passes through the light source point of the laser, i.e., the optical center of the laser speckle projector. Understandably, under the influence of factors such as camera calibration error, image matching error, and fitting error, the set of fitted straight lines will not intersect at a single point but will deviate to varying degrees. Therefore, the spatial point closest to the set of fitted straight lines is calculated and used as the optimal projector optical center.
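A minimal sketch of S70, assuming the 3-D points are stored as numpy arrays: a line is fitted to the speckle-image center points by SVD, and the projector optical center is taken as the least-squares point closest to the bundle of fitted lines.

```python
# Minimal sketch of S70: optical axis by 3-D line fitting, optical center as the
# least-squares point closest to a set of 3-D lines.
import numpy as np

def fit_line(points):
    """Fit a 3-D line; returns a point on the line and its unit direction."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[0] / np.linalg.norm(vt[0])

def closest_point_to_lines(lines):
    """Point minimising the summed squared distance to all lines.

    Each line is (p, v) with p a point on the line and v its unit direction.
    Solves sum_i (I - v_i v_i^T) x = sum_i (I - v_i v_i^T) p_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, v in lines:
        M = np.eye(3) - np.outer(v, v)
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# axis_point, axis_dir = fit_line(center_points_3d)       # projector optical axis
# optical_center = closest_point_to_lines(speckle_lines)  # projector optical center
```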
Figure 3 is a schematic diagram of the optical axis and optical center of a laser speckle projector in an embodiment of the present application. Figure 3 shows the physical spatial relationship between the speckle points with the same name on the calibration plate, the optical axis, and the (laser speckle) projector.
S80: Calculate the pose relationship between the camera and the laser speckle projector according to the preset laser speckle projector coordinate system.
In one embodiment, a laser speckle projector coordinate system can be established from the obtained optical center and optical axis position of the laser speckle projector, and the pose relationship between the camera and the laser speckle projector can be calculated in this laser speckle projector coordinate system.
Specifically, the laser speckle projector coordinate system takes the optical center of the laser speckle projector as its origin, its z-axis coincides with the optical axis of the laser speckle projector, and the direction facing the target is the positive direction. The normalized direction vector of the optical axis of the laser speckle projector in the camera coordinate system is V c=[v x, v y, v z] T, which is expressed as V p=[0,0,1] T in the laser speckle projector coordinate system; the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000017
where A x, A y, and A z are the Euler angles of the rotation matrix. It should be noted that the Euler angles may use the x-y-z rotation sequence; Euler angles with other rotation sequences are also possible.
The above equation can be simplified to:
Figure PCTCN2022087970-appb-000018
A x and A y can then be calculated:
Figure PCTCN2022087970-appb-000019
A z determines the directions of the x and y axes of the laser speckle projector coordinate system, as well as the speckle coordinates in the speckle image. Since the optical center and optical axis position of the laser speckle projector have been determined, the absolute physical position of the speckle pattern with respect to the laser speckle projector is fixed and independent of the Euler angles. Under the condition that most or all of the speckle image falls within the virtual image of the laser speckle projector, an appropriate A z value is selected and the rotation matrix R is calculated.
The coordinates of the optical center of the laser speckle projector in the camera coordinate system are (x p, y p, z p), and its coordinates in the laser speckle projector coordinate system are (0,0,0); the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000020
The translation vector is then obtained:
Figure PCTCN2022087970-appb-000021
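The exact Euler-angle formulas of this step are given as equation images in the original. The sketch below constructs, under the convention x_p = R x_c + T, a rotation whose z-axis is the fitted optical axis, with A_z acting as a free roll about that axis, together with the translation that maps the optical center to the projector origin; it is an equivalent construction, not the published formulas.

```python
# Minimal sketch of S80 under the camera-to-projector convention x_p = R x_c + T.
import numpy as np

def projector_pose(axis_dir_cam, optical_center_cam, A_z=0.0):
    """Rotation R and translation T of the projector frame w.r.t. the camera."""
    z_p = axis_dir_cam / np.linalg.norm(axis_dir_cam)   # projector z-axis in camera frame
    # Any direction not parallel to z_p seeds an orthonormal basis
    seed = np.array([1.0, 0.0, 0.0]) if abs(z_p[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x_p = seed - (seed @ z_p) * z_p
    x_p /= np.linalg.norm(x_p)
    y_p = np.cross(z_p, x_p)
    # Apply the free roll A_z about the optical axis
    c, s = np.cos(A_z), np.sin(A_z)
    x_roll = c * x_p + s * y_p
    y_roll = -s * x_p + c * y_p
    R = np.vstack([x_roll, y_roll, z_p])                 # rows = projector axes in camera frame
    T = -R @ np.asarray(optical_center_cam, dtype=float) # maps (x_p, y_p, z_p) to (0, 0, 0)
    return R, T
```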
S90: Calculate the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship, and generate the laser speckle projector virtual image.
Specifically, assuming the coordinates of a speckle image point in the camera coordinate system are (x 1, y 1, z 1) and its coordinates in the laser speckle projector coordinate system are (x 2, y 2, z 2), the relationship between the two can be expressed as:
Figure PCTCN2022087970-appb-000022
where R and T are the rotation matrix and translation vector calculated in S80, respectively; they describe the pose relationship between the camera coordinate system and the laser speckle projector coordinate system. Since the speckle image points are located on the surface of the calibration plate, the coordinates of the speckle image points in the camera coordinate system satisfy the following equation:
Figure PCTCN2022087970-appb-000023
where n is the normalized normal vector of the calibration plate, and d is the distance from the origin of the camera coordinate system to the calibration plate;
Combining the above two equations yields:
Figure PCTCN2022087970-appb-000024
The plane homography matrix between the camera image and the projector virtual image is then expressed as:
Figure PCTCN2022087970-appb-000025
where K c and K p are the internal parameter matrices of the camera and the laser speckle projector, respectively, and K p = K c. Since the speckle pattern is generated by laser light passing through a diffraction grating, without passing through a lens group, the distortion of the projector is not considered.
In a specific implementation, a calibration plate is arranged at the target to be measured, the laser speckle projector projects the speckle pattern onto the surface of the calibration plate, and the camera captures the corresponding speckle image. By calculating the spatial plane equation of the calibration plate, the plane homography matrix H can be obtained. Understandably, with the plane homography matrix H, the image captured by the camera can be converted to the laser speckle projector view, so that the monocular laser speckle projection system of the present application has binocular stereoscopic vision capability equivalent to that of a binocular camera system, and the calibration of external parameters can be completed conveniently and efficiently.
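A minimal sketch of S90, assuming the standard plane-induced homography implied by the text (the printed formula itself is an equation image in the original): H = K_p (R + T n^T / d) K_c^(-1), with K_p = K_c as stated.

```python
# Minimal sketch of S90: plane homography from camera pixels to virtual
# projector pixels, and warping the camera speckle image with it.
import cv2
import numpy as np

def projector_homography(K_c, R, T, n, d):
    """Plane homography mapping camera pixels to virtual projector pixels."""
    K_p = K_c                                            # per the description, K_p = K_c
    return K_p @ (R + np.outer(T, n) / d) @ np.linalg.inv(K_c)

# H = projector_homography(K_c, R, T, n, d)
# Virtual speckle image as seen from the projector, same size as the camera image:
# virtual = cv2.warpPerspective(camera_speckle_img, H, camera_speckle_img.shape[::-1])
```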
Further, step S90, i.e., the step of generating the laser speckle projector virtual image, specifically further includes: mapping the camera image to the virtual projector plane according to the plane homography matrix between the camera image and the laser speckle projector virtual image, to obtain the laser speckle projector virtual image.
In one embodiment, the plane homography matrix describes the mapping relationship, between different images, of points lying on the same plane; this plane may be the plane of the calibration plate or the plane of the corresponding image of the target to be measured. It can be understood that, when a binocular camera is used for shooting, the two cameras obtain images of the plane from different viewing angles, and the plane homography matrix describes the mapping relationship between these images from different viewing angles. In the solution of the present application, the connection between the camera and the laser speckle projector is established by means of a spatial plane, and the virtual speckle image corresponding to the laser speckle projector is recovered through the plane homography matrix. Further, after the virtual speckle image is recovered, the monocular laser speckle projection system in this application is equivalent to a binocular camera system, and binocular stereo vision techniques can be applied to the camera and the laser speckle projector (whose recovered virtual speckle image plays the role of a second camera) for stereo rectification, online calibration, and other operations.
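Once R, T and the virtual image are available, the projector can be treated as a second camera. The sketch below shows one possible stereo rectification with OpenCV, with an assumed image size and zero projector distortion, consistent with the statement that projector distortion is not considered.

```python
# Minimal sketch of treating the projector as the second camera of a stereo pair.
import cv2
import numpy as np

img_size = (1280, 800)                 # (width, height), hypothetical
dist_p = np.zeros(5)                   # projector treated as distortion-free
R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
    K_c, dist, K_c, dist_p, img_size, R, T.reshape(3, 1))

map1x, map1y = cv2.initUndistortRectifyMap(K_c, dist, R1, P1, img_size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K_c, dist_p, R2, P2, img_size, cv2.CV_32FC1)
# rect_cam  = cv2.remap(camera_speckle_img, map1x, map1y, cv2.INTER_LINEAR)
# rect_proj = cv2.remap(virtual_speckle_img, map2x, map2y, cv2.INTER_LINEAR)
```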
Understandably, a traditional monocular laser speckle projection system needs an accurate ranging device to capture corresponding speckle images at different standard distances in order to complete the external parameter calibration, whereas the monocular laser speckle projection system in this application can calculate, through a calibration plate with landmark features, the plane homography matrix describing the image conversion relationship between the camera and the laser speckle projector, so that a virtual speckle image can be recovered for the laser speckle projector. This gives the monocular laser speckle projection system in this application a capability equivalent to binocular stereo vision: two-view constraints can be used to speed up image matching, thereby improving measurement accuracy and reducing the complexity of calibration. Further, if long-term use of the device causes the measurement accuracy to degrade, the user can quickly recalibrate the external parameters of the monocular laser speckle system without returning it to the factory.
In the embodiments of the present application, compared with a traditional monocular laser speckle projection system, the external parameter calibration of the monocular laser speckle projection system is implemented with a calibration plate, without the need to use an accurate ranging device to capture corresponding speckle images at different standard distances. This significantly improves measurement efficiency and reduces measurement costs. The monocular laser speckle projection system in this application is equivalent to a binocular camera system with speckle images, which improves measurement accuracy. In addition, the present application can also calibrate the optical center and optical axis position of the laser speckle projector, allowing users to correct online the optical axis deviation that occurs during use of the monocular laser speckle projection system.
It should be noted that, in addition to using a single planar calibration plate to calibrate the external parameters of the monocular laser speckle projection system as adopted in this application, a three-dimensional calibration piece composed of multiple planes can also be used to calibrate the external parameters of the monocular laser speckle projection system. Specifically, with reference to the present application, for a three-dimensional calibration piece including multiple planes, the spatial plane equation of each plane can be calculated from the landmark features on that plane; then, by changing the position of the planes, the speckle image points with the same name can be found and their three-dimensional coordinates calculated, so as to estimate the optical center and optical axis position of the laser speckle projector and determine the homography matrices between the camera and the laser speckle projector for the different planes; finally, the laser speckle projector virtual image can be generated from these homography matrices. A monocular laser speckle projection system that uses a three-dimensional calibration piece composed of multiple planes for external parameter calibration has binocular stereoscopic vision capability equivalent to that of a binocular camera system and can complete the calibration of external parameters conveniently and efficiently. It should be understood that other calibration methods implemented with multiple planes should also fall within the protection scope of this application.
It should be understood that the sequence numbers of the steps in the above embodiment do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
An embodiment of the present application provides an external parameter calibration device for a monocular laser speckle projection system. The device includes:
a camera calibration image acquisition module, configured to collect camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system including a camera and a laser speckle projector;
a camera parameter calibration module, configured to calibrate camera parameters according to the camera calibration images;
a speckle image acquisition module, configured to prepare a calibration plate and collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
a spatial plane equation calculation module, configured to extract the landmark features of the speckle images according to the camera parameters and calculate the spatial plane equation of the calibration plate;
a same-name speckle image point acquisition module, configured to acquire speckle image points with the same name according to the speckle images;
a same-name speckle image point calculation module, configured to calculate the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
an optical center and optical axis estimation module, configured to estimate the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name;
a pose relationship calculation module, configured to calculate the pose relationship between the camera and the laser speckle projector according to the preset laser speckle projector coordinate system;
a virtual speckle image generation module, configured to calculate the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship, and generate the laser speckle projector virtual image.
In the embodiments of the present application, compared with a traditional monocular laser speckle projection system, the external parameter calibration of the monocular laser speckle projection system is implemented with a calibration plate, without the need to use an accurate ranging device to capture corresponding speckle images at different standard distances. This significantly improves measurement efficiency and reduces measurement costs. The monocular laser speckle projection system in this application is equivalent to a binocular camera system with speckle images, which improves measurement accuracy. In addition, the present application can also calibrate the optical center and optical axis position of the laser speckle projector, allowing users to correct online the optical axis deviation that occurs during use of the monocular laser speckle projection system.
The present application provides a computer device, including a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor, where the processor, when executing the computer-readable instructions, performs the steps of the external parameter calibration method for a monocular laser speckle projection system described in the embodiments.
The present application provides a computer-readable storage medium storing computer-readable instructions that, when executed by a processor, implement the steps of the external parameter calibration method for a monocular laser speckle projection system described in the embodiments.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the above functional units and modules is only used as an example; in practical applications, the above functions can be assigned to different functional units or modules as needed, that is, the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above.
The above embodiments are only used to illustrate the technical solutions of the present application and do not limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some of the technical features can be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be included within the protection scope of the present application.

Claims (10)

  1. An external parameter calibration method for a monocular laser speckle projection system, characterized by comprising:
    collecting camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system comprising a camera and a laser speckle projector;
    calibrating camera parameters according to the camera calibration images;
    preparing a calibration plate and collecting speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
    extracting the landmark features of the speckle images according to the camera parameters, and calculating the spatial plane equation of the calibration plate;
    acquiring speckle image points with the same name according to the speckle images;
    calculating the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
    estimating the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name;
    calculating the pose relationship between the camera and the laser speckle projector according to a preset laser speckle projector coordinate system;
    calculating a plane homography matrix between the camera image and a laser speckle projector virtual image according to the pose relationship, and generating the laser speckle projector virtual image.
  2. The method according to claim 1, characterized in that the landmark features are diagonal marks, and calculating the spatial plane equation of the calibration plate comprises:
    establishing a corresponding world coordinate system on the calibration plate, with the diagonal mark at the upper left as the origin of the coordinate system, the x-axis passing vertically downward through the diagonal mark at the lower left, the y-axis passing horizontally to the right through the diagonal mark at the upper right, and the z-axis perpendicular to the calibration plate and pointing outward, wherein the coordinates of the diagonal marks in the world coordinate system are expressed as (X w, Y w, 1), and the relationship between the pixel coordinates (u, v) of a diagonal mark and its three-dimensional world coordinates (X w, Y w, 1) is expressed as:
    Figure PCTCN2022087970-appb-100001
    where K c is the internal parameter matrix of the camera, s is a scale coefficient, R is the rotation matrix, and T is the translation vector; the coordinates (X c, Y c, Z c) of the diagonal marks on the surface of the calibration plate in the camera coordinate system can then be obtained:
    Figure PCTCN2022087970-appb-100002
    fitting, by the least squares method, the spatial plane equation of the calibration plate in the camera coordinate system.
  3. The method according to claim 1, characterized in that calculating the three-dimensional coordinates of the speckle image points with the same name comprises:
    calculating, from the speckle image point coordinates (u, v) and the camera parameters, the equivalent three-dimensional coordinates (X c, Y c, Z c) of the speckle image point in the camera coordinate system as:
    Figure PCTCN2022087970-appb-100003
    where (C x, C y) are the principal point coordinates of the camera, d x and d y are the physical dimensions of a pixel in the x-axis and y-axis directions respectively, and f is the camera focal length;
    the equation of the spatial straight line passing through the speckle image point and the origin of the camera coordinate system can be expressed as:
    Figure PCTCN2022087970-appb-100004
    calculating the three-dimensional coordinates of the speckle image points with the same name according to the spatial straight line equation and the spatial plane equation of the calibration plate, to obtain the set of three-dimensional coordinates corresponding to the speckle image points with the same name.
  4. The method according to claim 1, characterized in that calculating the pose relationship between the camera and the laser speckle projector comprises:
    the normalized direction vector of the optical axis of the laser speckle projector in the camera coordinate system is V c=[v x, v y, v z] T, which is expressed as V p=[0,0,1] T in the laser speckle projector coordinate system; the relationship between the two can be expressed as:
    Figure PCTCN2022087970-appb-100005
    calculating A x and A y:
    Figure PCTCN2022087970-appb-100006
    selecting a preset A z value and calculating the rotation matrix R, where A x, A y, and A z are the Euler angles of the rotation matrix R;
    the coordinates of the optical center of the laser speckle projector in the camera coordinate system are (x p, y p, z p), and its coordinates in the laser speckle projector coordinate system are (0,0,0); the relationship between the two can be expressed as:
    Figure PCTCN2022087970-appb-100007
    obtaining the translation vector T:
    Figure PCTCN2022087970-appb-100008
  5. The method according to claim 1, characterized in that calculating the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship comprises:
    assuming the coordinates of a speckle image point in the camera coordinate system are (x 1, y 1, z 1) and its coordinates in the laser speckle projector coordinate system are (x 2, y 2, z 2), the relationship between the two can be expressed as:
    Figure PCTCN2022087970-appb-100009
    since the speckle image points are located on the surface of the calibration plate, the coordinates of the speckle image points in the camera coordinate system satisfy the following equation:
    Figure PCTCN2022087970-appb-100010
    where n is the normalized normal vector of the calibration plate, and d is the distance from the origin of the camera coordinate system to the calibration plate;
    combining the above two equations yields:
    Figure PCTCN2022087970-appb-100011
    the plane homography matrix between the camera image and the projector virtual image is then expressed as:
    Figure PCTCN2022087970-appb-100012
    where K c and K p are the internal parameter matrices of the camera and the laser speckle projector, respectively, and K p = K c.
  6. The method according to claim 1, characterized in that generating the laser speckle projector virtual image comprises:
    mapping the camera image to the virtual projector plane according to the plane homography matrix between the camera image and the laser speckle projector virtual image, to obtain the laser speckle projector virtual image.
  7. The method according to any one of claims 1-6, characterized in that, after the laser speckle projector virtual image is generated, the laser speckle projector is used as a second camera and, together with the camera in the monocular laser speckle projection system, forms a binocular camera system.
  8. An external parameter calibration device for a monocular laser speckle projection system, characterized by comprising:
    a camera calibration image acquisition module, configured to collect camera calibration images under a monocular laser speckle projection system, the monocular laser speckle projection system comprising a camera and a laser speckle projector;
    a camera parameter calibration module, configured to calibrate camera parameters according to the camera calibration images;
    a speckle image acquisition module, configured to prepare a calibration plate and collect speckle images under the monocular laser speckle projection system, wherein the calibration plate is provided with at least three landmark features;
    a spatial plane equation calculation module, configured to extract the landmark features of the speckle images according to the camera parameters and calculate the spatial plane equation of the calibration plate;
    a same-name speckle image point acquisition module, configured to acquire speckle image points with the same name according to the speckle images;
    a same-name speckle image point calculation module, configured to calculate the three-dimensional coordinates of the speckle image points with the same name according to the spatial plane equation of the calibration plate;
    an optical center and optical axis estimation module, configured to estimate the optical center and optical axis position of the laser speckle projector according to the three-dimensional coordinates of the speckle image points with the same name;
    a pose relationship calculation module, configured to calculate the pose relationship between the camera and the laser speckle projector according to a preset laser speckle projector coordinate system;
    a virtual speckle image generation module, configured to calculate the plane homography matrix between the camera image and the laser speckle projector virtual image according to the pose relationship, and generate the laser speckle projector virtual image.
  9. A computer device, comprising a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor, characterized in that the processor, when executing the computer-readable instructions, performs the steps of the external parameter calibration method for a monocular laser speckle projection system according to any one of claims 1-7.
  10. A computer-readable storage medium storing computer-readable instructions, characterized in that the computer-readable instructions, when executed by a processor, implement the steps of the external parameter calibration method for a monocular laser speckle projection system according to any one of claims 1 to 7.
PCT/CN2022/087970 2022-04-20 2022-04-20 External parameter calibration method and device for a monocular laser speckle projection system WO2023201578A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/087970 WO2023201578A1 (zh) 2022-04-20 2022-04-20 External parameter calibration method and device for a monocular laser speckle projection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/087970 WO2023201578A1 (zh) 2022-04-20 2022-04-20 External parameter calibration method and device for a monocular laser speckle projection system

Publications (1)

Publication Number Publication Date
WO2023201578A1 true WO2023201578A1 (zh) 2023-10-26

Family

ID=88418676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/087970 WO2023201578A1 (zh) 2022-04-20 2022-04-20 External parameter calibration method and device for a monocular laser speckle projection system

Country Status (1)

Country Link
WO (1) WO2023201578A1 (zh)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170017858A1 (en) * 2015-07-15 2017-01-19 Samsung Electronics Co., Ltd. Laser speckle contrast imaging system, laser speckle contrast imaging method, and apparatus including the laser speckle contrast imaging system
CN111243002A (zh) * 2020-01-15 2020-06-05 中国人民解放军国防科技大学 应用于高精度三维测量的单目激光散斑投影系统标定及深度估计方法
CN111487043A (zh) * 2020-05-07 2020-08-04 北京的卢深视科技有限公司 单目散斑结构光系统的散斑投射器标定参数确定方法
CN111896032A (zh) * 2020-09-29 2020-11-06 北京清微智能科技有限公司 一种单目散斑投射器位置的标定系统及方法
CN113793387A (zh) * 2021-08-06 2021-12-14 中国科学院深圳先进技术研究院 单目散斑结构光系统的标定方法、装置及终端

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117146711A (zh) * 2023-10-30 2023-12-01 中国科学院自动化研究所 基于双振镜系统的大范围动态激光重建方法、系统及设备
CN117146711B (zh) * 2023-10-30 2024-02-13 中国科学院自动化研究所 基于双振镜系统的大范围动态激光重建方法、系统及设备
CN117705004A (zh) * 2023-12-27 2024-03-15 福建省高速公路科技创新研究院有限公司 填筑土层压实高精度测量及标定方法
CN117705004B (zh) * 2023-12-27 2024-05-28 福建省高速公路科技创新研究院有限公司 填筑土层压实高精度测量及标定方法
CN118096894A (zh) * 2024-02-06 2024-05-28 宽瑞智能科技(苏州)有限公司 面向手术机器人的单相机标定方法和装置
CN117718985A (zh) * 2024-02-07 2024-03-19 西安中科光电精密工程有限公司 一种基于智能三维视觉的搜排爆机器人
CN118379366A (zh) * 2024-06-24 2024-07-23 安徽炬视科技有限公司 基于几何先验的单画面相机外参自标定方法及系统

Similar Documents

Publication Publication Date Title
WO2023201578A1 (zh) 单目激光散斑投影系统的外参数标定方法和装置
CN111243002A (zh) 应用于高精度三维测量的单目激光散斑投影系统标定及深度估计方法
CN110057295B (zh) 一种免像控的单目视觉平面距离测量方法
CN108594245A (zh) 一种目标物运动监测系统及方法
CN111210468A (zh) 一种图像深度信息获取方法及装置
CN109272555B (zh) 一种rgb-d相机的外部参数获得及标定方法
CN106548489A (zh) 一种深度图像与彩色图像的配准方法、三维图像采集装置
CN109827521B (zh) 一种快速多线结构光视觉测量系统标定方法
CN109163657A (zh) 一种基于双目视觉三维重建的圆形目标位姿检测方法
CN112229323B (zh) 基于手机单目视觉的棋盘格合作目标的六自由度测量方法及其应用
CN114714356A (zh) 基于双目视觉的工业机器人手眼标定误差精确检测方法
CN111192235A (zh) 一种基于单目视觉模型和透视变换的图像测量方法
KR101926953B1 (ko) 4카메라 그룹 평면 어레이의 특징점의 매칭 방법 및 그에 기초한 측정 방법
CN110779491A (zh) 一种水平面上目标测距的方法、装置、设备及存储介质
CN114926538A (zh) 单目激光散斑投影系统的外参数标定方法和装置
KR101597163B1 (ko) 스테레오 카메라 교정 방법 및 장치
CN108180888A (zh) 一种基于可转动摄像头的距离检测方法
CN114299156A (zh) 无重叠区域下多相机的标定与坐标统一方法
CN112184811A (zh) 单目空间结构光系统结构校准方法及装置
CN113822920B (zh) 结构光相机获取深度信息的方法、电子设备及存储介质
CN208350997U (zh) 一种目标物运动监测系统
Chatterjee et al. A nonlinear Gauss–Seidel algorithm for noncoplanar and coplanar camera calibration with convergence analysis
WO2019087253A1 (ja) ステレオカメラのキャリブレーション方法
CN115375773A (zh) 单目激光散斑投影系统的外参数标定方法和相关装置
CN112419427A (zh) 用于提高飞行时间相机精度的方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22937815

Country of ref document: EP

Kind code of ref document: A1