CN112541950A - Method and device for calibrating external parameter of depth camera

Method and device for calibrating external parameter of depth camera

Info

Publication number
CN112541950A
Authority
CN
China
Prior art keywords
plane
coordinate system
camera
calibration
dimensional
Prior art date
Legal status
Pending
Application number
CN201910892567.7A
Other languages
Chinese (zh)
Inventor
李建禹
龙学雄
Current Assignee
Hangzhou Hikrobot Technology Co Ltd
Original Assignee
Hangzhou Hikrobot Technology Co Ltd
Application filed by Hangzhou Hikrobot Technology Co Ltd
Priority to CN201910892567.7A
Publication of CN112541950A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a method for calibrating the external parameters of a depth camera. The method comprises: acquiring depth image data, the depth image data comprising pixel point coordinates and depth values; converting pixel points in the depth image into spatial three-dimensional points in the camera coordinate system based on the acquired depth image data; obtaining a fitting plane of the three-dimensional points; and obtaining the parameters between the depth camera and a calibration plane according to the current pose relation between the camera coordinate system and a calibration plane parallel to or coincident with the fitting plane. The calibration plane comprises any current plane parallel or perpendicular to the bearing surface of the mobile robot body on which the depth camera is located. In the calibration process, a calibration result can be given from a single acquired image without any external tool; calibration can be performed against naturally horizontal or vertical surfaces such as the ground and walls, providing a robust, real-time application basis for applications based on depth image data.

Description

Method and device for calibrating external parameter of depth camera
Technical Field
The invention relates to the field of machine vision, in particular to a method for calibrating external parameters of a depth camera.
Background
In image measurement processes and machine vision applications, in order to determine the correlation between the three-dimensional geometric position of a point on the surface of an object in space and the corresponding point in the image, a geometric model of camera imaging must be established; the parameters of this geometric model are the camera parameters. Camera calibration is the process of computing these camera parameters.
During image capture, the pose of the camera is not fixed, so the camera coordinate system is not a stable coordinate system: the origin and axis directions of the camera coordinate system change as the camera moves. This requires introducing a stable, unchanging coordinate system, the world coordinate system, which is an absolute coordinate system. The transformation from the camera coordinate system to the world coordinate system requires a rotation and a translation, and these transformations constitute the external parameters of the camera.
A depth camera is a camera capable of directly obtaining depth image data. In the obtained depth image, each pixel carries, in addition to its pixel coordinates, a gray value representing the distance (depth value) of the corresponding spatial point, and can be represented as p(u, v, d), where u and v are the coordinates of the pixel in the image coordinate system and d is the depth value of the spatial three-dimensional point corresponding to the pixel.
According to their imaging principle, depth cameras include passive binocular cameras, active binocular cameras, time-of-flight (TOF) cameras, monocular structured light, and the like. At present, external parameter calibration of a depth camera generally targets parameters internal to the depth camera assembly; for example, for a depth camera formed by a binocular camera, the relative rotation and translation between the two cameras are calibrated. Such methods are not suitable for calibrating the external parameter relation between the depth camera and an external coordinate system.
Disclosure of Invention
The invention provides a method for calibrating external parameters of a depth camera, which is used for calibrating a transformation relation between a camera coordinate system and a calibration plane based on depth image data.
The invention provides a method for calibrating external parameters of a depth camera, which comprises the following steps,
acquiring depth image data, wherein the depth image data comprises pixel point coordinates and depth values;
converting pixel points in the depth image into space three-dimensional points under a camera coordinate system based on the acquired depth image data;
acquiring a fitting plane of the three-dimensional points based on the three-dimensional points;
obtaining parameters between the depth camera and the calibration plane according to the current pose relation between the calibration plane parallel to or coincident with the fitting plane and the camera coordinate system;
the calibration plane comprises any current plane parallel to or perpendicular to the bearing surface of the mobile robot body where the depth camera is located.
Preferably, the depth image data is at least one depth image, and the converting pixel points in the depth image into spatial three-dimensional points in a camera coordinate system includes,
and obtaining the three-dimensional point coordinates corresponding to the pixel points according to the mapping geometric model of the space three-dimensional point under the camera coordinate system and the two-dimensional image under the image coordinate system.
Preferably, the obtaining of the three-dimensional point coordinates corresponding to the pixel points according to the mapping geometric model of the three-dimensional point in space under the camera coordinate system and the two-dimensional image under the image coordinate system includes,
for any pixel point in the depth image,
taking the depth value of the pixel point as the z coordinate value of the three-dimensional point;
acquiring a first difference value between an x coordinate of a pixel point and x direction offset, and multiplying the ratio of a z coordinate value to a focal length in the x direction in camera internal parameters by the first difference value to obtain a result as the x coordinate value of the three-dimensional point; the x-direction deviation is the deviation of the origin of the camera coordinate system relative to the image coordinate system in the x direction;
acquiring a second difference value between the y coordinate of the pixel point and the y direction offset, and multiplying the ratio of the z coordinate value to the y direction focal length in the camera internal parameter by the second difference value to obtain a result as the y coordinate value of the three-dimensional point; the y-direction deviation is the deviation of the origin of the camera coordinate system relative to the image coordinate system in the y direction;
the x-direction offset and the y-direction offset of the camera coordinate system origin relative to the image coordinate system are obtained from the camera internal parameters;
and converting all pixel points in the depth image into three-dimensional point coordinates to obtain a three-dimensional point set.
Preferably, the method further comprises screening the three-dimensional points in the three-dimensional point set according to a screening strategy to obtain a screened three-dimensional point set; wherein the screening strategy comprises any one or any combination of the following conditions:
(1) removing three-dimensional points corresponding to a certain range of the upper part of the depth image, according to the orientation of the camera;
(2) when an initial estimate of the external parameters exists, converting the three-dimensional points in the camera coordinate system to the world coordinate system using the camera external parameters, and removing three-dimensional points whose height in the height direction exceeds a preset height threshold;
(3) for a binocular stereoscopic vision depth camera, removing three-dimensional points with depth values larger than a preset depth threshold;
(4) and for the time of flight TOF depth camera, removing the three-dimensional points of which the distance difference is greater than a preset distance threshold according to the distance difference between the pixel point and the adjacent pixel point.
Preferably, said obtaining a fitting plane of three-dimensional points based on said three-dimensional points comprises,
and acquiring a fitting plane equation of the three-dimensional points by utilizing a random sample consensus (RANSAC) algorithm according to the screened three-dimensional point set, wherein the number of the three-dimensional points is more than or equal to 3.
Preferably, the obtaining of the fitted plane equation of the three-dimensional points by using a random sample consensus RANSAC algorithm includes,
based on the screened three-dimensional point set, carrying out random selection to obtain a current subset formed by the randomly selected three-dimensional points, wherein the number of the three-dimensional points in the subset at least comprises 3;
obtaining a fitting plane estimate of the subset based on the three-dimensional points in the current subset;
according to the obtained fitting plane estimation, the coincidence degree of all three-dimensional points in the screened three-dimensional point set relative to the fitting plane estimation is obtained;
if the coincidence degree is not enough, returning to the step of randomly selecting based on the screened three-dimensional point set;
if the coincidence degree is reached, solving a fitting plane equation by using the inner point of the fitting plane estimation with the best coincidence degree;
the interior points are those three-dimensional points in the screened three-dimensional point set whose distance to the best-conforming fitting plane is smaller than a preset distance threshold.
Preferably, said obtaining an estimate of a fitted plane for the current subset based on three-dimensional points in the subset comprises,
if the number of the three-dimensional points in the current subset is equal to 3, substituting the coordinate values of the three-dimensional points into a fitting plane equation, and solving the unknown number in the fitting plane equation to obtain the fitting plane estimation of the current subset;
and if the number of the three-dimensional points in the current subset is more than 3, substituting the coordinate values of the three-dimensional points into a fitting plane equation, and solving the unknown number in the fitting plane equation by a least square method to obtain the fitting plane estimation of the current subset.
Preferably, the obtaining of the coincidence degree of all three-dimensional points in the screened three-dimensional point set with respect to the fitting plane estimation according to the obtained fitting plane estimation includes,
calculating the distance value from each three-dimensional point in the screened three-dimensional point set to the estimated fitting plane,
taking the three-dimensional point with the calculated distance value smaller than the set distance threshold value as an inner point,
counting the number of the inner points,
calculating the proportion of interior points to the total number of points in the screened three-dimensional point set to obtain the interior point rate;
determining the coincidence degree according to the interior point rate;
the solving of the fitting plane equation using the interior points of the best-conforming fitting plane estimate comprises: taking the fitting plane estimate with the highest interior point rate as the best fitting plane estimate, and re-solving the unknowns in the fitting plane equation by the least squares method from the interior points of the best estimate to obtain the fitting plane equation.
Preferably, the determining the degree of coincidence according to the inlier rate includes,
determining the coincidence degree according to the iteration times, wherein the iteration times meet the following conditions:
$$i \geq \frac{\log(1-\eta)}{\log(1-\varepsilon^{m})}$$
wherein i is the number of iterations, m is the number of three-dimensional points in the subset, η is a set confidence coefficient, and ε is the interior point rate, taken as the worst-case proportion of interior points, or set to the worst-case value in the initial state and then updated to the current maximum interior point rate as the iterations proceed.
Preferably, the determining of the coincidence degree according to the interior point rate includes determining the coincidence degree according to whether the probability that v subsets consist entirely of interior points satisfies a set confidence level, where that probability is:
$$P(v, \lambda) = \frac{\lambda^{v}}{v!}\, e^{-\lambda}$$
where λ is the expected number of selections, over the current iterations, in which the subset consists entirely of interior points.
Preferably, the obtaining of the parameters between the depth camera and the calibration plane according to the current pose relationship between the calibration plane parallel to or coincident with the fitting plane and the camera coordinate system includes,
obtaining a fitting plane normal vector according to the fitting plane equation,
obtaining an equation of a calibration plane parallel to or coincident with the fitting plane according to the fitting plane normal vector;
calculating the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane;
calculating the rotation of the camera coordinate system relative to the front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane,
and taking the distance transformation and the rotation transformation as parameters between the depth camera and a calibration plane.
Preferably, the depth image data includes image data of a bearing surface of the mobile robot body,
the obtaining of an equation of a calibration plane associated with the fitting plane according to the fitting plane normal vector includes:
obtaining an equation of a calibration plane which is parallel to the fitting plane and has a set distance according to the fitting plane normal vector; the calibration plane is the bearing ground of the robot body under the world coordinate system;
the calculating of the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane includes:
calculating the distance between the origin of the camera coordinate system and the calibration plane according to the equation parameters of the calibration plane to obtain the height transformation of the camera coordinate system relative to the calibration plane;
the calculating of the rotation of the camera coordinate system with respect to the front view of the calibration plane, to obtain the rotational transformation of the camera coordinate system with respect to the calibration plane, comprises,
and according to the equation parameters of the calibration plane, calculating the rotation of the y axis of the camera coordinate system relative to the normal vector of the fitting plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.
Preferably, the depth image data includes elevation image data perpendicular to the bearing surface of the mobile robot body,
the obtaining of an equation of a calibration plane associated with the fitting plane according to the fitting plane normal vector includes:
obtaining an equation of a calibration plane which is parallel to the fitting plane and has a set distance according to the normal vector of the fitting plane; the calibration plane is vertical to the bearing ground of the robot body under the world coordinate system;
the calculating of the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane includes:
calculating the distance from the origin of the camera coordinate system to the calibration plane according to the equation parameters of the calibration plane to obtain the distance transformation from the camera coordinate system to the calibration plane;
the calculating of the rotation of the camera coordinate system with respect to the front view of the calibration plane, to obtain the rotational transformation of the camera coordinate system with respect to the calibration plane, comprises,
and calculating the rotation of the z axis of the camera coordinate system relative to the normal vector of the calibration plane according to the equation parameters of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.
The invention provides an electronic device for calibrating external parameters of a depth camera, which comprises a memory and a processor, wherein,
the memory is used for storing computer programs;
the processor is used for executing the program stored in the memory and realizing the calibration method of the external parameter of any depth camera.
The invention also provides a computer readable storage medium, wherein a computer program is stored in the storage medium, and the computer program is executed by a processor to perform any one of the above calibration methods for the external parameter of the depth camera.
According to the method for calibrating the external parameters of a depth camera provided by the application, a fitting plane of the calibration plane is obtained from the spatial three-dimensional points in the camera coordinate system corresponding to the depth image data, so that the distance transformation and the rotation transformation between the camera coordinate system and the calibration plane are obtained from the current relative pose of the camera coordinate system and the fitting plane, yielding the external parameters of the camera relative to the calibration plane. In the calibration process, a calibration result can be given from a single acquired image, without any external tool prepared in advance; thus, whenever an image is acquired, calibration can be performed in real time against naturally horizontal or vertical surfaces such as the ground or a wall. The method is simple to use, highly applicable, and provides a robust, real-time application basis for applications based on depth image data.
Drawings
Fig. 1 is a schematic diagram of a principle of external parameter calibration of a depth camera based on depth image data.
Fig. 2 is a schematic flow chart of the depth camera external parameter calibration according to the embodiment of the present disclosure.
FIG. 3 is a schematic diagram of calibration with a fitted plane.
FIG. 4 is a schematic illustration of calibration with a second fitting plane (wall) perpendicular to the fitting plane.
Fig. 5 is a schematic diagram of data association of a calibration method according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a calibration apparatus according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical means and advantages of the present application more apparent, the present application will be described in further detail with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating the principle of depth camera external parameter calibration based on depth image data. A depth camera installed on a mobile robot body acquires depth image data; one type of depth image is shown in the figure, where the depth image region corresponding to the physical ground (indicated by thick solid lines) is the dotted-line frame area. Based on the depth image data of the ground, a fitting plane Ax + By + Cz + D = 0 is solved, and this fitting plane can be regarded as a plane representing the physical ground. Since a world coordinate system is generally based on the ground, the plane of the xy axes is parallel to the ground plane, the z axis is perpendicular to the plane of the xy axes, and the translation trajectory of the camera (changes in x and y coordinates) stays in the same plane. Therefore, calibrating the rotation and height (distance) of the camera relative to the ground is equivalent to calibrating the rotation and height (distance) of the camera relative to the fitting plane, which yields the rotational transformation and the z-axis transformation of the camera relative to the world coordinate system, and thus the parameters of the imaging model between the ground depth image and the physical ground plane (horizontal plane). More broadly, imaging model parameters can be obtained between the depth image and any desired plane in physical space parallel or perpendicular to the bearing surface on which the mobile robot body is located, that is, imaging model parameters between the depth image and the calibration plane. These parameters reflect the external relation between the depth camera and an external coordinate system and can be regarded as external parameters of the depth camera, which is what this application calls the external parameters of the depth camera.
Based on the calibration principle, the external parameter calibration of the depth camera is based on the depth image acquired by the depth camera, the depth image is converted into three-dimensional points under a camera coordinate system, a fitting plane associated with a calibration plane is fitted by the three-dimensional points, and the external parameter of the depth camera is acquired through the current pose relation between the fitting plane and the depth camera.
Referring to fig. 2, fig. 2 is a schematic flow chart of performing depth camera external reference calibration according to an embodiment of the present disclosure.
Step 201, acquiring any depth image data output by a depth camera, and projecting the depth image into a three-dimensional point cloud through camera internal parameters, namely projecting the depth image into a set of three-dimensional points.
In view of the calibration principle, a fitting plane associated with the calibration plane can be fitted from the three-dimensional points of the depth image, and the external parameters of the depth camera can be obtained from the current pose relation between the fitting plane and the depth camera; therefore at least one depth image output by the depth camera suffices. For any pixel point p(u, v, d) on the depth image, u and v are the coordinates of the pixel point in the image coordinate system, and d is the gray value of the pixel point, which is the depth value of the corresponding actual three-dimensional point:
a relational expression for converting pixel points in the depth image into three-dimensional points p_c(x, y, z) in the camera coordinate system is obtained according to a mapping geometric model, such as the pinhole model, between spatial three-dimensional coordinate points and the two-dimensional image plane (in pixels):
taking the depth value of the pixel point as the z coordinate value of the three-dimensional point; acquiring a first difference value between an x coordinate of a pixel point and an x-direction offset, wherein the x-direction offset is the offset of an origin (camera optical center) of a camera coordinate system relative to an image coordinate system in the x direction, and multiplying the ratio of a z coordinate value to a focal length in the x direction of camera internal parameters by the first difference value to obtain a result as the x coordinate value of a three-dimensional point;
acquiring a second difference value between the y coordinate of the pixel point and the y direction offset, wherein the y direction offset is the offset of the origin of the coordinate system of the camera relative to the y direction of the coordinate system of the image, multiplying the ratio of the z coordinate value to the focal length of the camera internal parameter in the y direction by the second difference value to obtain a result as the y coordinate value of the three-dimensional point,
the x-direction offset and the y-direction offset of the camera coordinate system origin relative to the image coordinate system are obtained from the camera internal parameters;
expressed mathematically as:
$$z = d, \qquad x = \frac{(u - c_x)\, z}{f_x}, \qquad y = \frac{(v - c_y)\, z}{f_y}$$

where $f_x$ and $f_y$ are the focal lengths of the camera internal parameters in the x and y directions, and $c_x$ and $c_y$ are the offsets of the camera coordinate system origin relative to the image coordinate system; these parameters correspond to the entries of the camera intrinsic matrix K:

$$K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$
In this way, each pixel point in the depth image can be converted into a three-dimensional point in the camera coordinate system; the resulting three-dimensional point set is also called a point cloud.
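As an illustration of this conversion, the following is a minimal Python sketch of the back-projection described above (the function name, array layout and depth_scale parameter are illustrative assumptions, not from the patent):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=1.0):
    """Back-project an H x W depth image into N x 3 three-dimensional
    points in the camera coordinate system (pinhole model)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth.astype(np.float64) * depth_scale      # z = depth value d
    x = (u - cx) * z / fx                           # x = (u - cx) * z / fx
    y = (v - cy) * z / fy                           # y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                 # drop pixels without valid depth
```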
Step 202, screening the three-dimensional point cloud, and removing the three-dimensional points with larger errors so as to improve the accuracy and the success rate of obtaining the fitting plane.
Taking the ground as an example, the three-dimensional point cloud calculated in step 201 is screened in this step, so as to primarily remove three-dimensional points which are obviously not on the ground and three-dimensional points with larger errors.
The point cloud screening strategy can add conditions and standards for point cloud screening according to different use scenes. Several point cloud screening conditions are provided, each screening condition has its own applicable condition, and in the actual use process, one or any combination of the following screening conditions can be provided:
(1) a range of three-dimensional points above the depth image are removed according to the camera orientation.
For example: when the camera is close to a level view, the three-dimensional points of the upper half of the depth image cannot lie on the ground, so half the image height is used as a threshold and the three-dimensional points converted from the pixels of the upper half of the depth image are removed.
(2) Under the condition that the external reference initial estimation value exists, the three-dimensional points in the camera coordinate system can be transferred to the world coordinate system through the external reference of the camera, and the points with the height larger than the preset height threshold value in the height direction are removed.
(3) If the principle of the depth camera is to use binocular stereo vision, the depth measurement accuracy may decrease as the distance increases due to the inherent characteristics of the binocular measurement principle. That is, since the depth accuracy of a three-dimensional point at a distance is deteriorated, a certain depth direction distance threshold (depth threshold) may be set, and a point exceeding the depth threshold may be removed.
(4) If the depth camera is based on the TOF method, the characteristics of TOF sensing make the camera susceptible to multipath reflection, so the depth image may contain many flying pixels; points whose distance difference from adjacent pixel points is larger than a preset distance threshold can be removed. A sketch of how the screening conditions might be combined is given below.
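The following Python sketch combines screening conditions (1) to (3); the thresholds, the assumption that the world z axis is the height direction, and all names are illustrative, and condition (4) is only indicated by a comment since it operates on the depth image itself:

```python
import numpy as np

def screen_point_cloud(points, rows, image_height,
                       depth_threshold=None, height_threshold=None,
                       R_wc=None, t_wc=None):
    """Apply screening conditions (1)-(3) to an N x 3 point cloud.
    `rows` holds the source image row of each point."""
    keep = np.ones(len(points), dtype=bool)
    # (1) near level view: drop points converted from the upper half of the image
    keep &= rows >= image_height // 2
    # (2) with an initial extrinsic estimate, drop points above a height threshold
    #     (assumes world z is the height direction)
    if R_wc is not None and height_threshold is not None:
        world = points @ R_wc.T + t_wc
        keep &= world[:, 2] <= height_threshold
    # (3) binocular depth cameras: drop distant points with poor depth accuracy
    if depth_threshold is not None:
        keep &= points[:, 2] <= depth_threshold
    # (4) TOF flying pixels would be removed on the depth image itself, by
    #     comparing each pixel's distance with that of its neighbours.
    return points[keep]
```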
Step 203, acquiring a fitting plane based on the screened three-dimensional point cloud;
in this step, one of the embodiments, a fitting plane is obtained using a Random Sample Consensus (RANSAC) algorithm. The method specifically comprises the following steps:
step 2031, based on the screened three-dimensional point cloud, performing random selection to obtain a subset formed by the randomly selected three-dimensional points, wherein the number of the three-dimensional points is equal to the unknown number of the fitting plane, and the fitting plane equation can be expressed as: ax + By + Cz + D is 0, can be deformed into Ax + By + Cz is 1,
thus, three unknowns a, b, c are included, so the number of three-dimensional points in the subset is at least 3.
Step 2032, based on the three-dimensional points in the subset, generating a fitting plane estimate for the subset: substituting the three-dimensional points in the selected subset into a plane equation, and solving an unknown number in a fitting plane estimation equation to obtain estimation of a fitting plane;
If the number m of three-dimensional points in the selected subset is greater than the number of unknowns, an over-determined system of equations must be solved; the unknowns in the fitting plane estimation equation can then be solved by regression analysis, for example by linear least squares, as follows:
The fitted plane equation is rewritten as ax + by + cz = 1; stacking the m selected points then gives

$$A \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} x_{1} & y_{1} & z_{1} \\ \vdots & \vdots & \vdots \\ x_{m} & y_{m} & z_{m} \end{bmatrix}$$

When the matrix $A^{\mathsf T} A$ is non-singular, the unknowns a, b, c have the unique least-squares solution $(a, b, c)^{\mathsf T} = (A^{\mathsf T} A)^{-1} A^{\mathsf T} \mathbf{1}$, where $x_{k}$, $y_{k}$, $z_{k}$ are the coordinates of the k-th three-dimensional point.
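For the over-determined case, the least-squares solution above is a single call in Python; a minimal sketch (the function name is illustrative, and the ax + by + cz = 1 form assumes the plane does not pass through the camera origin, i.e. D is nonzero):

```python
import numpy as np

def fit_plane_least_squares(points):
    """Fit ax + by + cz = 1 to an m x 3 array of points (m >= 3).
    Returns (a, b, c); the plane is ax + by + cz - 1 = 0."""
    A = np.asarray(points, dtype=np.float64)   # the m x 3 coordinate matrix
    ones = np.ones(len(A))
    # least-squares solution of A @ (a, b, c) = 1; unique when A^T A is non-singular
    coeffs, *_ = np.linalg.lstsq(A, ones, rcond=None)
    return coeffs
```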
Step 2033, based on the fitting plane estimation, obtaining the coincidence degree of all screened three-dimensional points relative to the fitting plane estimation;
one of the embodiments is to calculate an interior point rate of the fitting plane estimation, and determine the coincidence degree according to the interior point rate, specifically:
calculating the distance from each point in the screened three-dimensional point cloud to the estimated fitting plane,
taking the point with the calculated distance value smaller than the set distance threshold value as an inner point,
counting the number of the inner points,
and calculating the proportion of interior points to the total number of points in the screened three-dimensional point cloud to obtain the interior point rate; the larger this proportion, the higher the interior point rate, the higher the degree of conformity, and the better the fitting plane estimate.
Step 2034, judging whether the coincidence degree is satisfied, if yes, executing step 2035, otherwise, returning to step 2031, so as to randomly select the subset again to perform fitting plane estimation, thereby performing an estimation-confirmation cycle;
in this step, one of the embodiments determines whether the interior point rate satisfies a predetermined condition,
In the second embodiment, the number of iterations is chosen so that, with confidence η, at least one random selection during the iterative loop picks m points that are all interior points, which guarantees that a best fitting plane estimate is obtained at least once during the loop. Therefore, the number of iterations i should satisfy the following condition:
$$i \geq \frac{\log(1-\eta)}{\log(1-\varepsilon^{m})}$$
where m is the size of the subset, i.e. the number of three-dimensional points in it, and η is the confidence coefficient, generally set within the range 0.95 to 0.99. ε is the interior point rate; since ε is usually unknown, the worst-case proportion of interior points can be used, or ε is set to the worst-case value in the initial state and then continuously updated to the current maximum interior point rate as the iterations proceed. (A subset of m points consists entirely of interior points with probability ε^m, so i independent selections all fail with probability (1 - ε^m)^i; the bound above keeps this failure probability below 1 - η.)
In a third embodiment, it is determined whether the probability that v subsets are all interior points meets the required confidence. Specifically, each selected subset has one of two outcomes, "all interior points" or "not all interior points", so the number of all-interior-point subsets follows a binomial distribution with success probability p = ε^m. When p is small enough, this can be approximated by a Poisson distribution; thus, in i cycles, the probability of selecting v subsets that are all interior points can be expressed as

$$P(v, \lambda) = \frac{\lambda^{v}}{v!}\, e^{-\lambda}$$

where λ represents the expected number of selections, in i cycles, in which the subset consists entirely of interior points.
For example, one may require that the probability of selecting no all-interior-point subset in the i iteration cycles is less than 1 - η, i.e. P(0, λ) = e^{-λ} < 1 - η. With a confidence of 95%, for example, λ is approximately equal to 3, meaning that at 95% confidence an average of 3 "good" (all-interior-point) subsets can be selected in i cycles.
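Both stopping criteria reduce to a few lines of Python; a sketch (the example values of η and ε are illustrative):

```python
import math

def max_iterations(m, eta=0.99, eps=0.5):
    """Smallest i with 1 - (1 - eps**m)**i >= eta: with confidence eta, at
    least one of i random size-m subsets consists entirely of interior points."""
    return math.ceil(math.log(1.0 - eta) / math.log(1.0 - eps ** m))

def prob_v_all_inlier_subsets(v, i, m, eps):
    """Poisson approximation P(v, lambda) with lambda = i * eps**m: the
    probability of drawing exactly v all-interior-point subsets in i tries."""
    lam = i * (eps ** m)
    return (lam ** v) * math.exp(-lam) / math.factorial(v)
```

For example, max_iterations(3, 0.99, 0.5) gives 35: with a 50% interior point rate, 35 random triples suffice to include an all-interior-point triple with 99% confidence.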
Step 2035, solving a fitting plane equation by using the inner points estimated by the best fitting plane;
Specifically, based on the interior points of the fitting plane estimate with the highest interior point rate, the fitting plane is solved again by the least squares method to obtain the final fitting plane equation.
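Steps 2031 to 2035 combine into a short loop; a minimal RANSAC sketch reusing the helpers sketched above (fit_plane_least_squares, max_iterations), with an illustrative distance threshold:

```python
import numpy as np

def ransac_plane(points, dist_threshold=0.02, eta=0.99, max_iters=1000):
    """Sample minimal 3-point subsets, score each plane by its interior point
    rate, then refit the best plane on its interior points (steps 2031-2035)."""
    rng = np.random.default_rng()
    best_inliers, best_rate = None, 0.0
    i, n_iters = 0, max_iters
    while i < n_iters:
        subset = points[rng.choice(len(points), size=3, replace=False)]
        abc = fit_plane_least_squares(subset)          # exact fit for 3 points
        # distance of every point to the plane ax + by + cz - 1 = 0
        dist = np.abs(points @ abc - 1.0) / np.linalg.norm(abc)
        inliers = dist < dist_threshold
        rate = inliers.mean()
        if rate > best_rate:                           # best conformity so far
            best_rate, best_inliers = rate, inliers
            eps = min(max(best_rate, 1e-3), 0.999)     # update interior point rate
            n_iters = min(max_iters, max_iterations(3, eta, eps))
        i += 1
    if best_inliers is None:
        raise ValueError("no plane found")
    # step 2035: re-solve by least squares on the interior points of the best fit
    return fit_plane_least_squares(points[best_inliers])
```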
And step 204, calculating the distance from the origin of the camera coordinate system to the fitting plane and the rotation of the camera based on the parameters of the fitting plane.
Referring to FIG. 3, FIG. 3 is a schematic diagram of calibration using the fitting plane as the calibration plane; the symbol × denotes the vector cross product, and the fitting plane is the bearing ground of the robot body in the world coordinate system. The distance from the origin of the camera coordinate system to the fitting plane is the height transformation of the camera, and the rotation of the camera coordinate system relative to the front view of the fitting plane corresponds to the rotation of the camera y axis onto the normal vector of the fitting plane. Therefore, from the fitting plane equation Ax + By + Cz + D = 0:
the normal vector of the fitting plane (ground): n = (A, B, C);

the distance from the camera coordinate system origin to the fitting plane:

$$h = \frac{|D|}{\sqrt{A^{2} + B^{2} + C^{2}}}$$

the rotational transformation of the camera coordinate system, i.e. the rotation taking the camera y axis y = (0, 1, 0) onto the normal n, with rotation axis and angle

$$r = \frac{y \times n}{\left\| y \times n \right\|}, \qquad \theta = \arccos\!\left( \frac{y \cdot n}{\left\| y \right\| \left\| n \right\|} \right)$$

where the symbol × represents the cross product of vectors.
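A Python sketch of this computation for the ground case, using scipy for the axis-angle conversion (the axis-angle construction shown is one way to realize the rotation of the y axis onto the normal; function names are illustrative):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def ground_extrinsics(A, B, C, D):
    """From the fitted ground plane Ax + By + Cz + D = 0, return the camera
    height above the plane and the rotation taking the camera y axis onto
    the plane normal."""
    n = np.array([A, B, C], dtype=np.float64)
    height = abs(D) / np.linalg.norm(n)               # |D| / sqrt(A^2+B^2+C^2)
    y = np.array([0.0, 1.0, 0.0])                     # camera y axis
    n_unit = n / np.linalg.norm(n)
    axis = np.cross(y, n_unit)                        # rotation axis r = y x n
    angle = np.arccos(np.clip(np.dot(y, n_unit), -1.0, 1.0))
    if np.linalg.norm(axis) < 1e-12:                  # y already aligned with n
        return height, Rotation.identity()
    return height, Rotation.from_rotvec(axis / np.linalg.norm(axis) * angle)
```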
The above embodiment may be extended to obtain any equation of the calibration plane parallel to the fitting plane and having a set distance from the fitting plane according to the normal vector of the fitting plane, and thus obtain distance transformation and rotation transformation of the camera coordinate system with respect to the calibration plane, so as to obtain a parameter between the depth camera and the calibration plane.
If the depth image relates to a vertical surface (e.g., a wall) perpendicular to the bearing surface of the mobile robot body, for example when a depth image of the wall is acquired, calibration can be performed using the vertical surface. Referring to fig. 4, fig. 4 is a schematic diagram of calibration with a vertical fitting plane, in which the normal vector facing the camera is opposite to n and is therefore -n. The distance from the origin of the camera coordinate system to the wall is the distance transformation of the camera, and the rotation of the camera coordinate system relative to the front view of the calibration plane corresponds to the rotation of the camera z axis onto the normal vector of the wall. Therefore, from the calibration plane equation A'x + B'y + C'z + D' = 0:
the normal vector of the calibration plane: -n = (A', B', C');

the distance from the camera coordinate system origin to the calibration plane:

$$d = \frac{|D'|}{\sqrt{A'^{2} + B'^{2} + C'^{2}}}$$

the rotational transformation of the camera coordinate system, i.e. the rotation taking the camera z axis z = (0, 0, 1) onto -n, with rotation axis and angle

$$r = \frac{z \times (-n)}{\left\| z \times (-n) \right\|}, \qquad \theta = \arccos\!\left( \frac{z \cdot (-n)}{\left\| z \right\| \left\| n \right\|} \right)$$

where the symbol × represents the cross product of vectors.
The calibration plane equation here may be obtained through steps 201 to 203 based on the depth image.
The above embodiment may be extended to obtain any equation of the calibration plane parallel to the fitting plane and having a set distance from the fitting plane according to the normal vector of the fitting plane, and thus obtain distance transformation and rotation transformation of the camera coordinate system with respect to the calibration plane, so as to obtain parameters between the depth camera and the calibration plane.
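The wall case differs only in the reference axis and the sign of the normal; a Python sketch under the same assumptions as the ground case:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def wall_extrinsics(A, B, C, D):
    """From a fitted wall plane A'x + B'y + C'z + D' = 0, return the camera
    distance to the wall and the rotation taking the camera z axis onto the
    camera-facing normal -n."""
    n = -np.array([A, B, C], dtype=np.float64)        # normal oriented toward camera
    dist = abs(D) / np.linalg.norm(n)                 # |D'| / sqrt(A'^2+B'^2+C'^2)
    z = np.array([0.0, 0.0, 1.0])                     # camera z axis (view direction)
    n_unit = n / np.linalg.norm(n)
    axis = np.cross(z, n_unit)                        # rotation axis r = z x (-n)
    angle = np.arccos(np.clip(np.dot(z, n_unit), -1.0, 1.0))
    if np.linalg.norm(axis) < 1e-12:
        return dist, Rotation.identity()
    return dist, Rotation.from_rotvec(axis / np.linalg.norm(axis) * angle)
```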
According to different requirements, the depth camera external parameter calibration method can calibrate against the ground plane when the acquired depth image includes an image of the bearing surface of the mobile robot body, and against a wall when the acquired depth image includes wall image data perpendicular to the bearing surface, without any external tool. The method places no requirement on the principle of the depth camera and is applicable to external parameter calibration of depth cameras based on binocular, structured light or TOF principles. The calibration method is simple to operate, highly accurate and robust. Calibration requires only one depth image, the computational complexity is low, and calibration can be performed in real time. With the calibrated external parameters, three-dimensional information in the camera coordinate system can be converted into the world coordinate system, which facilitates operations such as ground removal and three-dimensional obstacle perception, and therefore has strong application value.
Referring to fig. 5, fig. 5 is a schematic diagram of data association in a calibration method according to an embodiment of the present application. Converting pixel points in the depth image into space three-dimensional points in a camera coordinate system based on the depth image data; in order to improve the accuracy and success rate of obtaining the fitting plane, the spatial three-dimensional points can be screened; when the RANSAC algorithm is adopted, randomly extracting three-dimensional points as subsets, obtaining fitting plane estimation of the subsets based on the subsets, and selecting the best fitting plane estimation according to the condition of the maximum interior point rate through the subsets randomly extracted for multiple times of the algorithm and the fitting plane estimation of each subset; recalculating the fitting plane based on the inner points of the best fitting plane; and if the fitting plane is taken as the calibration plane, calculating the height transformation and the rotation transformation of the camera coordinate system relative to the fitting plane according to the fitting plane.
Referring to fig. 6, fig. 6 is a schematic view of a calibration apparatus according to an embodiment of the present application. The apparatus comprises the following interconnected modules:
the depth image acquisition module is used for acquiring depth image data, and the depth image data comprises pixel point coordinates and depth values;
the conversion module is used for converting pixel points in the depth image into space three-dimensional points in a camera coordinate system based on the acquired depth image data;
the fitting plane obtaining module is used for obtaining a fitting plane of the three-dimensional points based on the three-dimensional points;
the transformation calculation module is used for obtaining parameters between the depth camera and the calibration plane according to the current pose relation between the calibration plane parallel to or coincident with the fitting plane and the camera coordinate system;
the calibration plane comprises any current plane parallel or perpendicular to the bearing surface of the mobile robot body where the depth camera is located.
The device also comprises a screening module which screens the three-dimensional points in the three-dimensional point set according to a screening strategy to obtain a screened three-dimensional point set.
The fitting plane acquisition module comprises a fitting plane acquisition module,
the random selection module is used for carrying out random selection on the basis of the screened three-dimensional point set to obtain a current subset formed by the randomly selected three-dimensional points, wherein the number of the three-dimensional points in the subset at least comprises 3;
the fitting plane estimation module is used for acquiring the fitting plane estimation of the subset based on the three-dimensional points in the current subset;
the coincidence degree acquisition module is used for acquiring the coincidence degree of all three-dimensional points in the screened three-dimensional point set relative to the fitting plane estimation according to the acquired fitting plane estimation;
if the coincidence degree is insufficient, control returns to the random selection module;
if the coincidence degree is reached, control passes to the re-solving module;
the interior points are those three-dimensional points in the screened three-dimensional point set whose distance to the best-conforming fitting plane is smaller than a preset distance threshold.
And the re-solution module is used for solving the fitting plane equation by using the inner points of the fitting plane estimation with the best conformity degree.
The fitting plane estimation module also comprises a step of substituting the coordinate values of the three-dimensional points into a fitting plane equation if the number of the three-dimensional points in the current subset is equal to 3, and solving the unknown number in the fitting plane equation to obtain the fitting plane estimation of the current subset;
and if the number of the three-dimensional points in the current subset is more than 3, substituting the coordinate values of the three-dimensional points into a fitting plane equation, and solving the unknown number in the fitting plane equation by a least square method to obtain the fitting plane estimation of the current subset.
The conformity degree obtaining module is further configured to calculate the distance value from each three-dimensional point in the screened three-dimensional point set to the estimated fitting plane,
taking the three-dimensional point with the calculated distance value smaller than the set distance threshold value as an inner point,
counting the number of the inner points,
calculating the proportion of interior points to the total number of points in the screened three-dimensional point set to obtain the interior point rate;
determining the coincidence degree according to the interior point rate;
alternatively,
determining the coincidence degree according to the iteration times, wherein the iteration times meet the following conditions:
$$i \geq \frac{\log(1-\eta)}{\log(1-\varepsilon^{m})}$$
wherein i is the number of iterations, m is the number of three-dimensional points in the subset, η is a set confidence coefficient, and ε is the interior point rate, taken as the worst-case proportion of interior points, or set to the worst-case value in the initial state and then updated to the current maximum interior point rate as the iterations proceed;
alternatively,
determining the coincidence degree according to whether the probability that all the v subsets are interior points meets the set confidence level, wherein the probability that all the v subsets are interior points is as follows:
$$P(v, \lambda) = \frac{\lambda^{v}}{v!}\, e^{-\lambda}$$
where λ is the expected number of selections, over the current iterations, in which the subset consists entirely of interior points.
The re-solving module is used for re-solving the unknown number in the fitting plane equation by a least square method according to the inner point estimated by the fitting plane with the highest inner point rate to obtain the fitting plane equation.
The transformation calculation module is configured to obtain the fitting plane normal vector according to the fitting plane equation,
obtaining an equation of a calibration plane parallel to or coincident with the fitting plane according to the fitting plane normal vector;
calculating the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane;
calculating the rotation of the camera coordinate system relative to the front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane,
and taking the distance transformation and the rotation transformation as parameters between the depth camera and a calibration plane.
The depth image data comprises image data of a bearing surface of a mobile robot body, the transformation calculation module further comprises,
obtaining an equation of a calibration plane which is parallel to the fitting plane and has a set distance according to the fitting plane normal vector; the calibration plane is the bearing ground of the robot body under the world coordinate system;
calculating the distance from the origin of the camera coordinate system to the fitting plane according to the equation parameters of the calibration plane to obtain the height transformation of the camera coordinate system relative to the calibration plane; calculating the rotation of the y-axis of the camera coordinate system relative to the normal vector of the fitting plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane,
and taking the height transformation and the rotation transformation as parameters between the depth camera and a calibration plane.
The depth image data comprises elevation image data perpendicular to the bearing surface of the mobile robot body, and the transformation calculation module further comprises obtaining an equation of a calibration plane which is parallel to the fitting plane and at a set distance, according to the normal vector of the fitting plane; the calibration plane is perpendicular to the bearing ground of the robot body in the world coordinate system;
calculating the distance from the origin of the camera coordinate system to the calibration plane according to the equation parameters of the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane; calculating the rotation of the z axis of the camera coordinate system relative to the normal vector of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane,
and taking the distance transformation and the rotation transformation as parameters between the depth camera and a calibration plane.
The present application further provides an electronic device for external parameter calibration of a depth camera, the electronic device comprising a memory and a processor, wherein,
the memory is used for storing computer programs;
and the processor is used for executing the program stored in the memory and realizing the calibration step of any depth camera external parameter.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
An embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the following steps:
acquiring depth image data, wherein the depth image data comprises pixel point coordinates and depth values;
converting pixel points in the depth image into space three-dimensional points under a camera coordinate system based on the acquired depth image data;
acquiring a fitting plane associated with the calibration plane based on the three-dimensional points;
obtaining parameters between the depth camera and the calibration plane according to the current pose relationship between the fitting plane and the camera coordinate system;
the calibration plane comprises any current plane parallel or perpendicular to the bearing surface of the mobile robot body where the depth camera is located.
For the device/network side device/storage medium embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and for the relevant points, refer to the partial description of the method embodiment.
It should be noted that embodiments of the depth camera external parameter calibration method provided in the present application are not limited to those described above. For example, when obtaining the fitting plane, a fitting algorithm other than RANSAC may be adopted, and when solving the over-determined equations, a solving method other than least squares, such as another regression calculation, may be adopted. In addition, to reduce the computational complexity of the over-determined equations, reliable three-dimensional points may be selected in a targeted manner to improve reliability, for example by using image feature points.
In this document, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (15)

1. A calibration method of external parameters of a depth camera is characterized by comprising the following steps,
acquiring depth image data, wherein the depth image data comprises pixel point coordinates and depth values;
converting pixel points in the depth image into space three-dimensional points under a camera coordinate system based on the acquired depth image data;
acquiring a fitting plane of the three-dimensional points based on the three-dimensional points;
obtaining parameters between the depth camera and the calibration plane according to the current pose relation between the calibration plane parallel to or coincident with the fitting plane and the camera coordinate system;
the calibration plane comprises any current plane parallel to or perpendicular to the bearing surface of the mobile robot body where the depth camera is located.
2. The method of claim 1, wherein the depth image data is at least one depth image, and wherein converting pixel points in the depth image into spatial three-dimensional points in a camera coordinate system comprises,
and obtaining the three-dimensional point coordinates corresponding to the pixel points according to the mapping geometric model of the space three-dimensional point under the camera coordinate system and the two-dimensional image under the image coordinate system.
3. The method of claim 2, wherein obtaining three-dimensional point coordinates corresponding to pixel points according to a mapping geometric model of spatial three-dimensional points in a camera coordinate system and two-dimensional images in an image coordinate system comprises,
for any pixel point in the depth image,
taking the depth value of the pixel point as the z coordinate value of the three-dimensional point;
acquiring a first difference value between an x coordinate of a pixel point and x direction offset, and multiplying the ratio of a z coordinate value to a focal length in the x direction in camera internal parameters by the first difference value to obtain a result as the x coordinate value of the three-dimensional point; the x-direction deviation is the deviation of the origin of the camera coordinate system relative to the image coordinate system in the x direction;
acquiring a second difference value between the y coordinate of the pixel point and the y direction offset, and multiplying the ratio of the z coordinate value to the y direction focal length in the camera internal parameter by the second difference value to obtain a result as the y coordinate value of the three-dimensional point; the y-direction deviation is the deviation of the origin of the camera coordinate system relative to the image coordinate system in the y direction;
the x-direction offset and the y-direction offset of the camera coordinate system origin relative to the image coordinate system are obtained from the camera internal parameters;
and converting all pixel points in the depth image into three-dimensional point coordinates to obtain a three-dimensional point set.
4. The method of claim 3, further comprising, according to a screening strategy, screening three-dimensional points in the set of three-dimensional points to obtain a screened set of three-dimensional points; wherein the screening strategy comprises any one or any combination of the following conditions:
(1) removing three-dimensional points within a certain range in the upper part of the depth image, according to the orientation of the camera;
(2) when an initial estimate of the external parameters exists, converting three-dimensional points in the camera coordinate system to the world coordinate system through the camera external parameters, and removing three-dimensional points whose height in the height direction exceeds a preset height threshold;
(3) for a binocular stereoscopic vision depth camera, removing three-dimensional points with depth values larger than a preset depth threshold;
(4) for a time-of-flight (TOF) depth camera, removing three-dimensional points whose distance difference from adjacent pixel points is greater than a preset distance threshold.
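A sketch of screening conditions (2) and (3) above; conditions (1) and (4) reduce to similar per-point masks. The thresholds, the world height axis, and the initial extrinsic estimate (R, t) are illustrative assumptions.

    import numpy as np

    def screen_points(pts, max_depth=None, R=None, t=None,
                      max_height=None, height_axis=2):
        # pts: N x 3 points in the camera coordinate system.
        keep = np.ones(len(pts), dtype=bool)
        if max_depth is not None:              # condition (3): depth cut-off
            keep &= pts[:, 2] <= max_depth
        if R is not None and t is not None and max_height is not None:
            world = pts @ R.T + t              # condition (2): needs an
            keep &= world[:, height_axis] <= max_height  # extrinsic estimate
        return pts[keep]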
5. The method of claim 4, wherein said obtaining a fitting plane of the three-dimensional points based on the three-dimensional points comprises,
acquiring a fitting plane equation of the three-dimensional points by a random sample consensus (RANSAC) algorithm from the screened three-dimensional point set, wherein the number of three-dimensional points is at least 3.
6. The method of claim 5, wherein obtaining the fitting plane equation of the three-dimensional points using the random sample consensus (RANSAC) algorithm comprises,
randomly selecting from the screened three-dimensional point set to obtain a current subset of randomly selected three-dimensional points, wherein the subset contains at least three three-dimensional points;
obtaining a fitting plane estimate of the subset based on the three-dimensional points in the current subset;
obtaining, from the fitting plane estimate, the degree of conformance of all three-dimensional points in the screened three-dimensional point set with respect to the fitting plane estimate;
if the degree of conformance is insufficient, returning to the step of randomly selecting from the screened three-dimensional point set;
if the degree of conformance is reached, solving the fitting plane equation using the interior points of the fitting plane estimate with the best degree of conformance;
wherein the interior points are the three-dimensional points in the screened three-dimensional point set whose distance to the best-conforming fitting plane is smaller than a preset distance threshold.
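The loop of claim 6 can be sketched as follows, with a fixed iteration budget and minimal three-point samples; the iteration count and distance threshold are placeholders (claim 9 gives one way to choose the count).

    import numpy as np

    def ransac_plane(pts, n_iters=100, dist_thresh=0.01, seed=0):
        # Sample 3 points, estimate a plane, measure conformance over the
        # whole screened set, and keep the best-conforming estimate.
        rng = np.random.default_rng(seed)
        best_mask = np.zeros(len(pts), dtype=bool)
        for _ in range(n_iters):
            sample = pts[rng.choice(len(pts), size=3, replace=False)]
            n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(n)
            if norm < 1e-12:                   # degenerate (collinear) sample
                continue
            n /= norm
            d = -n @ sample[0]                 # plane: n . p + d = 0
            mask = np.abs(pts @ n + d) < dist_thresh
            if mask.sum() > best_mask.sum():
                best_mask = mask
        return pts[best_mask]                  # interior points, to be refitted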
7. The method of claim 6, wherein obtaining the fitting plane estimate of the current subset based on the three-dimensional points in the subset comprises,
if the number of three-dimensional points in the current subset equals 3, substituting their coordinate values into the fitting plane equation and solving its unknowns to obtain the fitting plane estimate of the current subset;
if the number of three-dimensional points in the current subset is greater than 3, substituting their coordinate values into the fitting plane equation and solving its unknowns by least squares to obtain the fitting plane estimate of the current subset.
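For the over-determined branch of claim 7 (more than 3 points), one common least-squares solution — our choice, the claim does not prescribe a solver — takes the plane normal as the direction of least spread of the centred points, via SVD; with exactly 3 points the same code returns the exact plane, so one routine covers both branches.

    import numpy as np

    def fit_plane_lstsq(pts):
        # Least-squares plane through N >= 3 points: the unit normal is the
        # right singular vector with the smallest singular value.
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        n = vt[-1]
        d = -n @ centroid              # plane equation: n . p + d = 0
        return n, d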
8. The method of claim 6, wherein obtaining, from the fitting plane estimate, the degree of conformance of all three-dimensional points in the screened three-dimensional point set with respect to the fitting plane estimate comprises,
calculating the distance from each three-dimensional point in the screened three-dimensional point set to the estimated fitting plane,
taking the three-dimensional points whose calculated distance is smaller than the set distance threshold as interior points,
counting the number of interior points,
calculating the proportion of interior points to the total number of points in the screened three-dimensional point set to obtain the interior point rate,
and determining the degree of conformance according to the interior point rate;
wherein solving the fitting plane equation using the interior points of the best-conforming fitting plane estimate comprises: taking the fitting plane estimate with the highest interior point rate as the best fitting plane estimate, and re-solving the unknowns in the fitting plane equation by least squares using the interior points of the best fitting plane estimate, to obtain the fitting plane equation.
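The conformance measure of claim 8 in sketch form (the threshold is illustrative); the final refit is the fit_plane_lstsq call above applied to the interior points of the best estimate.

    import numpy as np

    def interior_point_rate(pts, n, d, dist_thresh=0.01):
        # Distance of every screened point to the plane estimate (|n| = 1),
        # interior-point mask, and the interior point rate.
        dist = np.abs(pts @ n + d)
        mask = dist < dist_thresh
        return mask, mask.sum() / len(pts)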
9. The method of claim 8, wherein said determining the degree of conformance according to the interior point rate comprises,
determining the degree of conformance according to the number of iterations k, wherein k satisfies:

k ≥ log(1 − η) / log(1 − ε^m)

wherein m is the number of three-dimensional points in the subset, η is a set confidence level, and ε is the interior point rate, taken either as the worst-case proportion of interior points, or initialized to the worst-case value and updated to the current maximum interior point rate as the iterations proceed.
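A worked instance of the bound above, with illustrative values: for m = 3, ε = 0.5 and η = 0.99, k = ⌈log 0.01 / log 0.875⌉ = 35 iterations.

    import math

    def ransac_iterations(m=3, eta=0.99, eps=0.5):
        # Smallest k with (1 - eps**m)**k <= 1 - eta, i.e. enough iterations
        # to draw one all-interior-point sample with confidence eta.
        return math.ceil(math.log(1.0 - eta) / math.log(1.0 - eps ** m))

    print(ransac_iterations())         # -> 35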
10. The method of claim 8, wherein said determining the degree of conformance according to the interior point rate comprises determining the degree of conformance according to whether the probability that v subsets are all interior points satisfies a set confidence level, wherein the probability that v subsets are all interior points is:

P(v) = (λ^v / v!) · e^(−λ)

where λ is the expected number of times an all-interior-point subset is selected over the current iterations.
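Reading the formula above as a Poisson model — our interpretation of the reconstructed claim — λ = k·ε^m is the expected number of all-interior-point subsets after k draws, and iteration can stop once the probability of having drawn none, P(0) = e^(−λ), falls below 1 − η:

    import math

    def all_interior_prob(k, eps, m, v):
        # P(v) = lam**v * exp(-lam) / v!  with lam = k * eps**m.
        lam = k * (eps ** m)
        return lam ** v * math.exp(-lam) / math.factorial(v)

    def may_stop(k, eps, m, eta=0.99):
        # Stop once even zero all-interior-point draws is implausible.
        return all_interior_prob(k, eps, m, v=0) < 1.0 - eta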
11. The method according to any one of claims 1 to 10, wherein obtaining the parameters between the depth camera and the calibration plane according to the current pose relation between the calibration plane parallel to or coincident with the fitting plane and the camera coordinate system comprises,
obtaining a fitting plane normal vector according to the fitting plane equation,
obtaining an equation of a calibration plane parallel to or coincident with the fitting plane according to the fitting plane normal vector;
calculating the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane;
calculating the rotation of the camera coordinate system relative to the front view of the calibration plane, to obtain the rotation transformation of the camera coordinate system relative to the calibration plane,
and taking the distance transformation and the rotation transformation as parameters between the depth camera and a calibration plane.
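A sketch of claim 11: once the plane is written as n·p + d = 0 with a unit normal, the origin-to-plane distance is |d|, and one way — our choice — to express the rotation is the Rodrigues rotation taking a chosen camera axis onto the plane normal. The axis argument anticipates claims 12 and 13.

    import numpy as np

    def extrinsics_from_plane(n, d, axis):
        # Distance and rotation of the camera coordinate system relative to
        # the calibration plane n . p + d = 0.
        norm = np.linalg.norm(n)
        n, d = n / norm, d / norm          # make the normal a unit vector
        if axis @ n < 0:                   # fix the sign convention of n
            n, d = -n, -d
        distance = abs(d)                  # origin-to-plane distance
        v = np.cross(axis, n)              # rotation axis (unnormalised)
        s, c = np.linalg.norm(v), axis @ n
        if s < 1e-12:
            return distance, np.eye(3)     # axis already parallel to n
        K = np.array([[0.0, -v[2], v[1]],
                      [v[2], 0.0, -v[0]],
                      [-v[1], v[0], 0.0]])
        R = np.eye(3) + K + K @ K * ((1.0 - c) / s ** 2)
        return distance, R                 # R maps `axis` onto n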
12. The method of claim 11, wherein the depth image data includes image data of the bearing surface of the mobile robot body, and
wherein obtaining the equation of the calibration plane associated with the fitting plane according to the fitting plane normal vector comprises:
obtaining, according to the fitting plane normal vector, an equation of a calibration plane parallel to the fitting plane at a set distance, wherein the calibration plane is the bearing ground of the robot body in the world coordinate system;
wherein calculating the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane comprises:
calculating the distance from the origin of the camera coordinate system to the calibration plane according to the equation parameters of the calibration plane, to obtain the height transformation of the camera coordinate system relative to the calibration plane;
and wherein calculating the rotation of the camera coordinate system relative to the front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane comprises:
calculating, according to the equation parameters of the calibration plane, the rotation of the y axis of the camera coordinate system relative to the fitting plane normal vector, to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.
13. The method of claim 11, wherein the depth image data includes image data of a vertical surface perpendicular to the bearing surface of the mobile robot body, and
wherein obtaining the equation of the calibration plane associated with the fitting plane according to the fitting plane normal vector comprises:
obtaining, according to the fitting plane normal vector, an equation of a calibration plane parallel to the fitting plane at a set distance, wherein the calibration plane is perpendicular to the bearing ground of the robot body in the world coordinate system;
wherein calculating the distance between the origin of the camera coordinate system and the calibration plane to obtain the distance transformation of the camera coordinate system relative to the calibration plane comprises:
calculating the distance from the origin of the camera coordinate system to the calibration plane according to the equation parameters of the calibration plane, to obtain the distance transformation of the camera coordinate system relative to the calibration plane;
and wherein calculating the rotation of the camera coordinate system relative to the front view of the calibration plane to obtain the rotation transformation of the camera coordinate system relative to the calibration plane comprises:
calculating, according to the equation parameters of the calibration plane, the rotation of the z axis of the camera coordinate system relative to the calibration plane normal vector, to obtain the rotation transformation of the camera coordinate system relative to the calibration plane.
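Putting claims 12 and 13 together with the helpers sketched above; the axis conventions (camera y axis toward the ground, z axis toward the wall) and the synthetic points are our assumptions.

    import numpy as np

    # Reuses fit_plane_lstsq and extrinsics_from_plane from the sketches above.
    floor = np.array([[0.0, 1.2, 1.0], [1.0, 1.2, 2.0],
                      [-1.0, 1.2, 3.0], [0.5, 1.2, 2.5]])  # flat floor at y = 1.2
    n, d = fit_plane_lstsq(floor)
    # Claim 12: mounting height and rotation of the camera y axis onto the
    # ground normal.
    height, R_ground = extrinsics_from_plane(n, d, axis=np.array([0.0, 1.0, 0.0]))
    # Claim 13: for a wall point set, the same call with the z axis would give
    # the camera-to-wall distance and rotation, e.g.
    # dist, R_wall = extrinsics_from_plane(n_wall, d_wall,
    #                                      axis=np.array([0.0, 0.0, 1.0]))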
14. An electronic device for calibrating external parameters of a depth camera, comprising a memory and a processor, wherein,
the memory is used for storing computer programs;
the processor is configured to execute the program stored in the memory to implement the method for calibrating the external parameter of the depth camera according to any one of claims 1 to 13.
15. A computer-readable storage medium, wherein the storage medium has a computer program stored therein, and the computer program is executed by a processor to perform the method for calibrating the external parameter of the depth camera according to any one of claims 1 to 13.
CN201910892567.7A 2019-09-20 2019-09-20 Method and device for calibrating external parameter of depth camera Pending CN112541950A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910892567.7A CN112541950A (en) 2019-09-20 2019-09-20 Method and device for calibrating external parameter of depth camera

Publications (1)

Publication Number Publication Date
CN112541950A true CN112541950A (en) 2021-03-23

Family

ID=75012324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910892567.7A Pending CN112541950A (en) 2019-09-20 2019-09-20 Method and device for calibrating external parameter of depth camera

Country Status (1)

Country Link
CN (1) CN112541950A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103646389A (en) * 2013-03-26 2014-03-19 中国科学院电子学研究所 SAR slant range image match automatic extraction method based on geometric model
CN104156972A (en) * 2014-08-25 2014-11-19 西北工业大学 Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras
CN104376558A (en) * 2014-11-13 2015-02-25 浙江大学 Cuboid-based intrinsic parameter calibration method for Kinect depth camera
JP2017118396A (en) * 2015-12-25 2017-06-29 Kddi株式会社 Program, device and method for calculating internal parameter of depth camera
CN107945234A (en) * 2016-10-12 2018-04-20 杭州海康威视数字技术股份有限公司 A kind of definite method and device of stereo camera external parameter
CN107146256A (en) * 2017-04-10 2017-09-08 中国人民解放军国防科学技术大学 Camera marking method under outfield large viewing field condition based on differential global positioning system
CN108280853A (en) * 2018-01-11 2018-07-13 深圳市易成自动驾驶技术有限公司 Vehicle-mounted vision positioning method, device and computer readable storage medium
CN108416791A (en) * 2018-03-01 2018-08-17 燕山大学 A kind of monitoring of parallel institution moving platform pose and tracking based on binocular vision
CN108629756A (en) * 2018-04-28 2018-10-09 东北大学 A kind of Kinect v2 depth images Null Spot restorative procedure
CN109544677A (en) * 2018-10-30 2019-03-29 山东大学 Indoor scene main structure method for reconstructing and system based on depth image key frame
CN110111248A (en) * 2019-03-15 2019-08-09 西安电子科技大学 A kind of image split-joint method based on characteristic point, virtual reality system, camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN Shijie et al., "Automatic extrinsic calibration of an RGB-D camera based on ground-plane detection in point clouds", Journal of Image and Graphics, Vol. 23, No. 6, pp. 866-873 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967347A (en) * 2021-03-30 2021-06-15 深圳市优必选科技股份有限公司 Pose calibration method and device, robot and computer readable storage medium
WO2022205845A1 (en) * 2021-03-30 2022-10-06 深圳市优必选科技股份有限公司 Pose calibration method and apparatus, and robot and computer-readable storage medium
CN112967347B (en) * 2021-03-30 2023-12-15 深圳市优必选科技股份有限公司 Pose calibration method, pose calibration device, robot and computer readable storage medium
CN113340310A (en) * 2021-07-08 2021-09-03 深圳市人工智能与机器人研究院 Step terrain identification and positioning method for mobile robot and related device
CN113340310B (en) * 2021-07-08 2024-03-15 深圳市人工智能与机器人研究院 Step terrain identification and positioning method and relevant device for mobile robot
CN113284197A (en) * 2021-07-22 2021-08-20 浙江华睿科技股份有限公司 TOF camera external reference calibration method and device for AGV, and electronic equipment
CN113284197B (en) * 2021-07-22 2021-11-23 浙江华睿科技股份有限公司 TOF camera external reference calibration method and device for AGV, and electronic equipment
CN113689391A (en) * 2021-08-16 2021-11-23 炬佑智能科技(苏州)有限公司 ToF device installation parameter acquisition method and system and ToF device

Similar Documents

Publication Publication Date Title
CN112541950A (en) Method and device for calibrating external parameter of depth camera
CN107230225B (en) Method and apparatus for three-dimensional reconstruction
CN104040590B (en) Method for estimating pose of object
JP5759161B2 (en) Object recognition device, object recognition method, learning device, learning method, program, and information processing system
CN110176032B (en) Three-dimensional reconstruction method and device
JP6571225B2 (en) Camera posture estimation method and system
CN111144349B (en) Indoor visual relocation method and system
CN112489140B (en) Attitude measurement method
CN113156407B (en) Vehicle-mounted laser radar external parameter joint calibration method, system, medium and device
CN112083403B (en) Positioning tracking error correction method and system for virtual scene
CN111123242A (en) Combined calibration method based on laser radar and camera and computer readable storage medium
Sveier et al. Object detection in point clouds using conformal geometric algebra
CN116129037B (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
CN114638891A (en) Target detection positioning method and system based on image and point cloud fusion
CN112288813B (en) Pose estimation method based on multi-view vision measurement and laser point cloud map matching
WO2021193672A1 (en) Three-dimensional model generation method and three-dimensional model generation device
CN117197245A (en) Pose restoration method and device
CN112446952B (en) Three-dimensional point cloud normal vector generation method and device, electronic equipment and storage medium
JP6584139B2 (en) Information processing apparatus, information processing method, and program
CN113706505A (en) Cylinder fitting method and device for removing local outliers in depth image
CN113793379A (en) Camera pose solving method, system, equipment and computer readable storage medium
CN116205788B (en) Three-dimensional feature map acquisition method, image processing method and related device
Le et al. Geometry-Based 3D Object Fitting and Localizing in Grasping Aid for Visually Impaired
Karami et al. Camera Arrangement in Visual 3D Systems using Iso-disparity Model to Enhance Depth Estimation Accuracy
CN115222799B (en) Method and device for acquiring image gravity direction, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310052 5 / F, building 1, building 2, no.700 Dongliu Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.
