CN113256611A - RGB-D registration precision testing method and device - Google Patents


Info

Publication number: CN113256611A
Authority: CN (China)
Prior art keywords: registration, rgb, hole, color, image
Legal status: Granted
Application number: CN202110682745.0A
Other languages: Chinese (zh)
Other versions: CN113256611B
Inventor
李柯蒙
杨金峰
张合勇
王蓉
罗义鸣
方俊龙
Current Assignee: Zhejiang Guangpo Intelligent Technology Co ltd
Original Assignee: Zhejiang Guangpo Intelligent Technology Co ltd
Application CN202110682745.0A filed by Zhejiang Guangpo Intelligent Technology Co ltd; published as CN113256611A and, upon grant, as CN113256611B
Legal status: Active

Classifications

    • G (Physics) > G06 (Computing; calculating or counting) > G06T (Image data processing or generation, in general)
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 3/4007: Interpolation-based scaling, e.g. bilinear interpolation
    • G06T 7/12: Edge-based segmentation
    • G06T 7/136: Segmentation; edge detection involving thresholding
    • G06T 7/181: Segmentation; edge detection involving edge growing; involving edge linking
    • G06T 7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T 2207/10024: Color image
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/30196: Human being; person
    • G06T 2207/30201: Face

Abstract

The invention discloses a method and a device for testing RGB-D registration accuracy, addressing the waste of computing resources and the high test difficulty caused in the prior art by extra registration and de-distortion operations and by complicated test data. The method comprises: acquiring a registered depth map and a registered color map of a test card with planar holes as the shot object; extracting the holes on the depth map and the color map; fitting hole feature points according to the structural characteristics of the holes, taking the center or vertices of the fitted contour as feature point coordinates; and calculating the coordinate difference between the depth map and the color map at each hole feature point to obtain the current registration value, which is inversely proportional to the registration accuracy. By extracting feature points directly on the registered depth map, the invention is more direct and more effective; it outputs the pixel differences in the horizontal and vertical directions, which describe the accuracy more intuitively, and it simplifies the calculation process and the amount of data to be processed.

Description

RGB-D registration precision testing method and device
Technical Field
The invention relates to the technical field of image data processing, and in particular to a method and a device for testing RGB-D registration accuracy.
Background
RGB-D registration is a common function of RGB-D cameras, and registration accuracy is an important index for evaluating camera performance: it directly determines the quality of applications developed on an RGB-D camera. For example, in three-dimensional face reconstruction, if the registration accuracy is low, facial color and texture cannot be attached to the correct three-dimensional positions; if the color of the eyes is attached to the position of the mouth, the whole face reconstruction fails. Similar applications include object detection and recognition, localization, and mapping. Registration accuracy is determined by two factors: the ranging accuracy of the depth camera, and the calibration accuracy between the RGB camera and the depth camera (the binocular pair) inside the RGB-D camera; higher accuracy generally costs more. An accuracy testing method is therefore needed, on the one hand to evaluate camera performance and on the other to guide developers in choosing a camera. For instance, three-dimensional reconstruction demands high registration accuracy while object detection demands less, so an application developer can select the most cost-effective camera from the numerical registration-accuracy results. Meanwhile, the binocular calibration algorithm for the RGB and depth cameras also needs its performance tested through the registration accuracy.
However, existing registration test methods cannot obtain matching feature points directly: besides the color and depth maps required for the registration calculation itself, images of other signals must be acquired; the test process requires extra registration calculation, and de-distortion and coordinate mapping must be performed with the camera's intrinsic and extrinsic parameters, making acquisition and calculation cumbersome.
For example, Chinese patent publication CN110378971A discloses "a method, apparatus, device and storage medium for detecting image alignment accuracy". The method comprises: acquiring the result of the device under test shooting a test pattern in the test apparatus, the result comprising an infrared (IR) image, a color (RGB) image and a depth image, the test pattern carrying preset feature points; determining the three-dimensional coordinates of the feature points in the IR image mapped into the RGB pixel coordinate system from the feature points in the IR image and the depth image; and determining the image alignment accuracy of the device under test from the coordinates of the feature points in the RGB image and the three-dimensional coordinates of the IR feature points mapped into the RGB pixel coordinate system. The registration error is defined as the difference of the three-dimensional coordinates of matched points in the depth map and the color map. Matched points can be obtained directly, by extracting feature points on the depth map itself, or indirectly; the above method is an indirect one.
That method has several drawbacks. It needs an IR image, but only some RGB-D cameras produce an IR image and expose it to the user, so a tester may be unable to obtain one. It needs the intrinsic parameters (with distortion coefficients) of the color and depth cameras and the extrinsic parameters between them, which again only some RGB-D cameras provide, so a tester may be unable to obtain them. It needs image processing such as de-distortion and coordinate mapping, yet images output by an RGB-D camera are generally already undistorted, distorted images are hard to obtain, and the calculation is complex and may be beyond a tester; moreover, coordinate mapping is itself a registration operation, so the main calculations of the existing method duplicate the very object under test, seriously wasting computing resources and raising the test difficulty. It needs a backlight plate emitting visible or infrared light for illumination, which makes the test apparatus harder to obtain. Finally, it defines accuracy as a three-dimensional coordinate difference, whereas a test engineer cares more about the two-dimensional pixel differences in the horizontal and vertical directions, so the three-dimensional difference is neither intuitive nor practical.
Disclosure of Invention
The invention aims to solve the problems of the prior art, in which extra registration and de-distortion operations and complicated test data waste computing resources and make testing difficult, and provides an RGB-D registration accuracy testing method and device.
To achieve this aim, the invention adopts the following technical scheme:
An RGB-D registration accuracy testing method, characterized by comprising the following steps:
S1, acquiring a registered depth map and a registered color map of a test card with planar holes as the shot object;
S2, extracting the holes on the depth map and the color map;
S3, fitting hole feature points according to the structural characteristics of the holes, and extracting the center or vertex coordinates of the fitted contour as feature point coordinates;
S4, calculating the coordinate difference between the depth map and the color map at each hole feature point, and calculating the current registration value, which is inversely proportional to the registration accuracy.
Unlike the prior art, which acquires corresponding depth-map and color-map feature points indirectly, the invention extracts feature points directly on the registered depth map, which is more direct and more effective; it outputs the pixel differences in the horizontal and vertical directions, describing the accuracy more intuitively, and it simplifies the calculation process and the amount of data to be processed.
Preferably, the acquisition in S1 comprises either of the following:
online acquisition: enable the registration function of the RGB-D camera and shoot the test card;
offline acquisition: acquire the depth map and the color map, then perform registration outside the RGB-D camera.
The invention can thus use online or offline acquisition.
Preferably, S2 comprises the following steps:
S21, extract the holes on the depth map: the object seen through each hole is the background plate, whose depth value is larger than that of the test card, so the holes are segmented by the depth difference;
S22, extract the holes on the color map: the object seen through each hole is the background plate, whose color differs from that of the test card, so the holes are segmented by the color difference.
Because the object photographed at each hole is the background plate, on the depth map the holes can be segmented by the depth difference (the background plate's depth exceeding the test card's); the color map is similar: the background plate's color differs markedly from the test card's, so the holes can be segmented by the color difference.
Preferably, S21 segments the holes as follows: set pixels whose value is larger than the test-card depth to 1 and all others to 0, obtaining a binary hole image;
and S22 segments the holes as follows: set pixels whose color differs from the test-card color to 1 and all others to 0, obtaining a binary hole image.
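The two binarizations of S21 and S22 can be sketched as follows (a minimal NumPy sketch; the card depth, depth margin and color tolerance are illustrative assumptions, not values fixed by the patent):

```python
import numpy as np

def depth_hole_mask(depth_mm, card_depth_mm, margin_mm=50):
    # S21: pixels deeper than the test card (the background plate seen
    # through a hole) become 1; pixels on the card become 0.
    return (depth_mm > card_depth_mm + margin_mm).astype(np.uint8)

def color_hole_mask(rgb, card_color, tol=40):
    # S22: pixels whose color differs from the card color by more than
    # `tol` in any channel become 1; card-colored pixels become 0.
    diff = np.abs(rgb.astype(np.int16) - np.asarray(card_color, dtype=np.int16))
    return (diff.max(axis=-1) > tol).astype(np.uint8)
```

With, say, a green card and a white background plate, the white pixels seen through the holes exceed the tolerance in every channel and form the binary hole image.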
Preferably, S3 comprises the following steps:
S31, determine a standard equation according to the shape of the hole, and fit the hole contour by least squares;
S32, check whether the color map and the depth map have the same resolution; if not, shrink the larger-resolution image or enlarge the smaller-resolution image with an image scaling algorithm so that the two have the same resolution; if they are the same, proceed to the next step;
S33, acquire the hole feature point coordinates (Yd, Xd) on the depth map and (Yc, Xc) on the color map respectively;
S34, define the row-column pixel coordinates of each hole feature point on the color map as Pc(i,j) = (Yc(i,j), Xc(i,j)) and on the depth map as Pd(i,j) = (Yd(i,j), Xd(i,j));
where (i, j) denotes the j-th feature point of the i-th hole.
After the hole image is obtained, the center or vertex of a hole could be taken directly as a feature point, but fitting removes the burrs that noise causes at the hole edges and yields more accurate feature points. The fitting follows the hole's structure: a circular hole is fitted with an ellipse and its center taken as the feature point; a polygonal hole is fitted with straight lines and the line intersections, i.e. the polygon vertices, taken as feature points.
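The fit-then-intersect idea for a polygonal hole can be sketched as follows (a sketch only, assuming edge pixels have already been grouped per edge; total least squares via SVD stands in for the patent's unspecified least-squares form):

```python
import numpy as np

def fit_edge_line(pts):
    # Total-least-squares line n . p = d through edge pixels;
    # pts is an (N, 2) array of (row, col) coordinates.
    pts = np.asarray(pts, dtype=float)
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    n = vt[-1]                      # unit normal = direction of least spread
    return n, float(n @ c)

def line_intersection(l1, l2):
    # Feature point = intersection of two fitted edge lines.
    (n1, d1), (n2, d2) = l1, l2
    return np.linalg.solve(np.stack([n1, n2]), np.array([d1, d2]))
```

With the left hole of embodiment 1 (top edge y = 2, left edge x = 3) this recovers the upper-left corner (2, 3), i.e. Pd(1,1).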
Preferably, the image scaling algorithm in S32 includes bilinear interpolation, trilinear interpolation, and downsampling.
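The bilinear option in S32 can be sketched for a single-channel image as follows (a minimal sketch; production code would typically call a library resize instead):

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    # Resize a 2-D array to (out_h, out_w) by bilinear interpolation,
    # used to bring the color map and depth map to a common resolution.
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
    c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
    top = a * (1 - wx) + b * wx     # blend along columns
    bot = c * (1 - wx) + d * wx
    return top * (1 - wy) + bot * wy  # then blend along rows
```

Upscaling a 2x2 block [[0, 1], [2, 3]] to 3x3 keeps the corners and places the average 1.5 at the center.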
Preferably, S4 comprises the following steps:
S41, define the difference in the horizontal direction as the column coordinate difference abs(Xd(i,j) - Xc(i,j)) and the difference in the vertical direction as the row coordinate difference abs(Yd(i,j) - Yc(i,j));
S42, take the mean, variance, standard deviation, maximum and minimum of the coordinate differences over the feature points as the current registration value; the larger the current registration value, the lower the registration accuracy.
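Steps S41 and S42 amount to per-axis absolute differences followed by their statistics; a sketch (NumPy's population variance and standard deviation, taken over the matched feature points):

```python
import numpy as np

def registration_values(p_d, p_c):
    # p_d, p_c: (N, 2) arrays of matched (row, col) feature points
    # on the depth map and the color map respectively.
    diff = np.abs(np.asarray(p_d, dtype=float) - np.asarray(p_c, dtype=float))
    d_y, d_x = diff[:, 0], diff[:, 1]   # vertical / horizontal pixel error
    stat = lambda d: {"mean": d.mean(), "var": d.var(), "std": d.std(),
                      "max": d.max(), "min": d.min()}
    return {"vertical": stat(d_y), "horizontal": stat(d_x)}
```

Larger values mean lower registration accuracy; the per-axis pixel errors are exactly the quantities a test engineer reads off directly.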
An RGB-D registration accuracy testing device comprises a test card, a background plate and a camera, arranged in the order camera, test card, background plate; the shooting direction of the camera is perpendicular to the test card, the test card is parallel to the background plate, the test card is coated with a color visible to both the color camera and the depth camera, and the test card is a plane with diffuse-reflection characteristics.
Preferably, the test card is provided with uniformly distributed holes.
The test card is a plane in which regularly shaped holes, circular or polygonal, are evenly cut; the surface facing the camera has diffuse-reflection characteristics so that the camera can see the card. The camera's shooting direction points at the test card, with the background plate behind it; the distances between the test card, the background plate and the camera all lie within the measurement range of the depth camera, and the distance between the background plate and the test card is greater than the minimum measurable depth of the depth camera. The background plate can be a building wall; it must be visible to the color camera and differ clearly in color from the test card on the color image.
Therefore, the invention has the following beneficial effects:
1. The method obtains matched feature points directly from the test apparatus; no images of other signals need to be acquired beyond the color and depth maps that the registration uses and outputs; no registration calculation is needed in the test beyond the camera's own registration (or the registration that produced the test data); and no de-distortion or coordinate mapping with camera intrinsics and extrinsics is required.
2. The invention fully reuses the computation result of the object under test; the calculation is simple and easy to implement. The test apparatus is also simple, needs no extra light source, and is cheap and easy to build.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a flowchart of registration accuracy calculation in embodiment 1.
Fig. 3 is a schematic diagram of the structure of the apparatus of the present invention.
FIG. 4 is a schematic diagram of the structure of the test card of the present invention.
Fig. 5 is a schematic diagram of depth data acquired by the camera in embodiment 2.
Fig. 6 is an image display of depth data acquired by the camera according to embodiment 2.
FIG. 7 is a data diagram of the holes extracted with the 1 m threshold in embodiment 2.
FIG. 8 is the binary image of the holes extracted with the 1 m threshold in embodiment 2.
FIG. 9 is the depth-map image after straight-line fitting and feature-point extraction in embodiment 2.
Fig. 10 is a schematic diagram of color data R channel data acquired by the camera according to embodiment 2.
Fig. 11 is a schematic diagram of color data G channel data acquired by the camera according to embodiment 2.
Fig. 12 is a schematic diagram of color data B channel data acquired by the camera according to embodiment 2.
Fig. 13 is a pictorial display of color data acquired by the camera according to embodiment 2.
FIG. 14 is a data diagram of the holes extracted on white in embodiment 2.
FIG. 15 is the binary image of the holes extracted on white in embodiment 2.
FIG. 16 is the color-map image after straight-line fitting and feature-point extraction in embodiment 2.
In the figures: 1. test card; 101. hole; 2. background plate; 3. camera.
Detailed Description
The invention is further described with reference to the following detailed description and accompanying drawings.
Example 1:
the embodiment provides a method for testing RGB-D registration accuracy, as shown in fig. 1 and 2, including the following steps:
S1, acquiring a registered depth map and a registered color map of a test card with planar holes as the shot object;
the acquisition in S1 comprises either of the following:
online acquisition: enable the registration function of the RGB-D camera and shoot the test card;
offline acquisition: acquire the depth map and the color map, then perform registration outside the RGB-D camera.
S2, extracting the holes on the depth map and the color map;
S2 comprises the following steps:
S21, extract the holes on the depth map: the object seen through each hole is the background plate, whose depth value is larger than that of the test card, so the holes are segmented by the depth difference;
in S21 the holes are segmented as follows: set pixels whose value is larger than the test-card depth to 1 and all others to 0, obtaining a binary hole image;
S22, extract the holes on the color map: the object seen through each hole is the background plate, whose color differs from that of the test card, so the holes are segmented by the color difference;
in S22 the holes are segmented as follows: set pixels whose color differs from the test-card color to 1 and all others to 0, obtaining a binary hole image.
S3, fitting hole feature points according to the structural characteristics of the holes, and extracting the center or vertex coordinates of the fitted contour as feature point coordinates;
S3 comprises the following steps:
S31, determine a standard equation according to the shape of the hole, and fit the hole contour by least squares;
S32, check whether the color map and the depth map have the same resolution; if not, shrink the larger-resolution image or enlarge the smaller-resolution image with an image scaling algorithm so that the two have the same resolution; if they are the same, proceed to the next step;
the image scaling algorithm comprises bilinear interpolation, trilinear interpolation and downsampling;
S33, acquire the hole feature point coordinates (Yd, Xd) on the depth map and (Yc, Xc) on the color map respectively;
S34, define the row-column pixel coordinates of each hole feature point on the color map as Pc(i,j) = (Yc(i,j), Xc(i,j)) and on the depth map as Pd(i,j) = (Yd(i,j), Xd(i,j));
where (i, j) denotes the j-th feature point of the i-th hole.
S4, calculating the coordinate difference between the depth map and the color map at each hole feature point, and calculating the current registration value, which is inversely proportional to the registration accuracy;
S4 comprises the following steps:
S41, define the difference in the horizontal direction as the column coordinate difference abs(Xd(i,j) - Xc(i,j)) and the difference in the vertical direction as the row coordinate difference abs(Yd(i,j) - Yc(i,j));
S42, take the mean, variance, standard deviation, maximum and minimum of the coordinate differences over the feature points as the current registration value; the larger the current registration value, the lower the registration accuracy.
This embodiment also provides a corresponding RGB-D registration accuracy testing device using the above method. As shown in figs. 3 and 4, the device comprises a test card 1, a background plate 2 and a camera 3, arranged in the order camera 3, test card 1, background plate 2. The shooting direction of the camera 3 is perpendicular to the test card 1; holes 101 are evenly distributed on the test card 1; the test card 1 is parallel to the background plate 2 and coated with a color visible to both the color camera and the depth camera; the test card 1 is a plane with diffuse-reflection characteristics. The distances between the test card 1, the background plate 2 and the camera 3 all lie within the measurement range of the depth camera, and the distance between the background plate 2 and the test card 1 is greater than the minimum measurable depth of the depth camera. The background plate can be a building wall; it must be visible to the color camera and differ clearly in color from the test card on the color image.
Example 2:
This embodiment provides an RGB-D registration accuracy testing method whose steps are otherwise the same as in embodiment 1. Here the test apparatus uses rectangular holes, a green test card and a white background plate; the data of the calculation process are as follows:
As shown in figs. 5 and 6, the depth data are collected with the camera 1 m from the test card and 1.2 m from the background plate, in millimeters; noise is present in fig. 5, where a value of "1100" appears. Fig. 6 is the image display of the depth data collected by the camera in embodiment 2.
Figs. 7 and 8 show the data and the image display of the binary hole image extracted with the 1 m threshold: fig. 7 is the data diagram; fig. 8 is the binary image.
As shown in fig. 9, straight lines are fitted on the depth map and the feature points extracted; in this embodiment the feature points are the intersections of the fitted lines.
The lines fitted to the top, bottom, left and right edges of the left hole are:
top: y = 2
bottom: y = 4
left: x = 3
right: x = 5
The intersection points are:
upper left: Pd(1,1) = (2,3)
upper right: Pd(1,2) = (2,5)
lower left: Pd(1,3) = (4,3)
lower right: Pd(1,4) = (4,5)
Similarly, the feature points extracted from the right hole are:
Pd(2,1) = (4,9), Pd(2,2) = (4,11), Pd(2,3) = (6,9), Pd(2,4) = (6,11).
the graphical display includes fitted straight lines, "+" is a feature point.
As shown in fig. 10 to 12, the color map is divided into RGB three-channel data, and R-channel, G-channel, and B-channel data are displayed, respectively.
As shown in fig. 13, a color image is displayed.
As shown in figs. 14 and 15, fig. 14 is the data schematic of the binary hole image extracted on white: pixels whose three channels are all 255 are set to 1, all others to 0; fig. 15 is the binary image.
As shown in fig. 16, straight lines are fitted on the color map and the feature points extracted: Pc(1,1) = (3,3), Pc(1,2) = (3,5), Pc(1,3) = (5,3), Pc(1,4) = (5,5), Pc(2,1) = (3,8), Pc(2,2) = (3,10), Pc(2,3) = (5,8), Pc(2,4) = (5,10).
The graphical display includes fitted straight lines, "+" is a feature point.
Then the registration accuracy is calculated:
registration value d in vertical directionY(i,j)=abs(Yd(i,j)-Yc(i,j))={abs(2-3),abs(2-3),abs(4-5),abs(4-5),abs(4-3),abs(4-3), abs(6-5), abs(6-5)}={1,1,1,1,1,1,1,1}
Registration value d in horizontal directionX(i,j)=abs(Xd(i,j)-Xc(i,j))={abs(3-3),abs(5-5),abs(3-3),abs(5-5),abs(9-8), abs(11-10),abs(9-8),abs(11-10)}={0,0,0,0,1,1,1,1}
Mean value:
mean(dY(i,j)) = (1+1+1+1+1+1+1+1)/8=1
mean(dX(i,j)) = (0+0+0+0+1+1+1+1)/8=0.5
Standard deviation (about each set's own mean):
std(dY(i,j)) = sqrt(((1-1)² + (1-1)² + (1-1)² + (1-1)² + (1-1)² + (1-1)² + (1-1)² + (1-1)²)/8) = 0
std(dX(i,j)) = sqrt(((0-0.5)² + (0-0.5)² + (0-0.5)² + (0-0.5)² + (1-0.5)² + (1-0.5)² + (1-0.5)² + (1-0.5)²)/8) = 0.5
maximum value:
Max(dY(i,j))=1
Max(dX(i,j))=1
minimum value:
Min(dY(i,j))=1
Min(dX(i,j))=0
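The statistics of this embodiment can be re-checked with a few lines of plain Python (population standard deviation, i.e. deviations taken about each set's own mean):

```python
import statistics

# The two sets of per-point absolute differences from this embodiment.
d_y = [1, 1, 1, 1, 1, 1, 1, 1]
d_x = [0, 0, 0, 0, 1, 1, 1, 1]

mean_y, mean_x = statistics.mean(d_y), statistics.mean(d_x)   # 1 and 0.5
std_y, std_x = statistics.pstdev(d_y), statistics.pstdev(d_x)  # 0 and 0.5
extremes = (max(d_y), min(d_y), max(d_x), min(d_x))            # 1, 1, 1, 0
```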
In the invention, the larger the registration value, the lower the registration accuracy; conversely, the smaller the registration value, the higher the accuracy.
The above embodiments further illustrate the invention in detail and should not be construed as limiting its scope; a skilled engineer can make insubstantial modifications and adaptations of the invention based on the above disclosure.

Claims (9)

1. An RGB-D registration accuracy testing method, characterized by comprising the following steps:
S1, acquiring a registered depth map and a registered color map of a test card with planar holes as the shot object;
S2, extracting the holes on the depth map and the color map;
S3, fitting hole feature points according to the structural characteristics of the holes, and extracting the center or vertex coordinates of the fitted contour as feature point coordinates;
S4, calculating the coordinate difference between the depth map and the color map at each hole feature point, and calculating the current registration value, which is inversely proportional to the registration accuracy.
2. The RGB-D registration accuracy testing method of claim 1, wherein the acquisition in S1 comprises either of the following:
online acquisition: enabling the registration function of the RGB-D camera and shooting the test card;
offline acquisition: acquiring the depth map and the color map, then performing registration outside the RGB-D camera.
3. The RGB-D registration accuracy testing method of claim 1, wherein S2 comprises the following steps:
S21, extracting the holes on the depth map, where the object seen through each hole is the background plate, whose depth value is larger than that of the test card, the holes being segmented by the depth difference;
S22, extracting the holes on the color map, where the object seen through each hole is the background plate, whose color differs from that of the test card, the holes being segmented by the color difference.
4. The RGB-D registration accuracy testing method of claim 3, wherein S21 segments the holes as follows: setting pixels whose value is larger than the test-card depth to 1 and all others to 0, obtaining a binary hole image;
and S22 segments the holes as follows: setting pixels whose color differs from the test-card color to 1 and all others to 0, obtaining a binary hole image.
5. The RGB-D registration accuracy testing method of claim 1, wherein S3 comprises the following steps:
S31, determining a standard equation according to the shape of the hole, and fitting the hole contour by least squares;
S32, checking whether the color map and the depth map have the same resolution; if not, shrinking the larger-resolution image or enlarging the smaller-resolution image with an image scaling algorithm so that the two have the same resolution; if they are the same, proceeding to the next step;
S33, acquiring the hole feature point coordinates (Yd, Xd) on the depth map and (Yc, Xc) on the color map respectively;
S34, defining the row-column pixel coordinates of each hole feature point on the color map as Pc(i,j) = (Yc(i,j), Xc(i,j)) and on the depth map as Pd(i,j) = (Yd(i,j), Xd(i,j));
where (i, j) denotes the j-th feature point of the i-th hole.
6. The RGB-D registration accuracy testing method of claim 5, wherein the image scaling algorithm in S32 includes bilinear interpolation, trilinear interpolation and downsampling.
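Bilinear interpolation, the first scaling algorithm named in claim 6, can be written in a few lines for a single-channel image. A bare-bones sketch (grayscale only, hypothetical `bilinear_resize` helper; a real implementation would use a library routine):

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D array to (out_h, out_w) by bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)          # source row coordinate per output row
    xs = np.linspace(0, w - 1, out_w)          # source column coordinate per output column
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = (ys - y0)[:, None]                    # fractional row weights
    wx = (xs - x0)[None, :]                    # fractional column weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

Downsampling the larger image rather than upsampling the smaller one avoids inventing detail, which is why claim 6 lists it as an alternative.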
7. The RGB-D registration accuracy testing method as claimed in claim 1, wherein S4 comprises the steps of:
S41, the difference in the horizontal direction is defined as the column-coordinate difference abs(Xd(i,j) - Xc(i,j)), and the difference in the vertical direction is defined as the row-coordinate difference abs(Yd(i,j) - Yc(i,j));
and S42, the mean, variance, standard deviation, maximum and minimum of the coordinate differences over all feature points are computed as the current registration values; the larger these values are, the lower the registration accuracy.
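The statistics of S41-S42 reduce to a few array operations. A minimal sketch, assuming the feature points have already been stacked into (N, 2) arrays of (row, col) coordinates; `registration_metrics` is a hypothetical name:

```python
import numpy as np

def registration_metrics(pc, pd):
    """S41/S42 sketch: pc, pd are (N, 2) arrays of (row, col) feature-point
    coordinates from the color and depth maps respectively. Returns per-axis
    statistics of the absolute coordinate differences (column 0 = vertical/row,
    column 1 = horizontal/column)."""
    diff = np.abs(np.asarray(pd, dtype=float) - np.asarray(pc, dtype=float))
    return {
        "mean": diff.mean(axis=0),
        "var": diff.var(axis=0),
        "std": diff.std(axis=0),
        "max": diff.max(axis=0),
        "min": diff.min(axis=0),
    }
```

Per claim 7, larger values of these statistics indicate worse color-depth registration.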
8. An RGB-D registration accuracy testing device adopting the RGB-D registration accuracy testing method as claimed in any one of claims 1 to 7, characterized by comprising a test card (1), a background plate (2) and a camera (3), wherein the camera (3), the test card (1) and the background plate (2) are arranged in sequence; the shooting direction of the camera (3) is perpendicular to the test card (1); the test card (1) is parallel to the background plate (2); the test card (1) is coated with a color that is visible to both the color camera and the depth camera; and the test card (1) is a plane with diffuse-reflection characteristics.
9. The RGB-D registration accuracy test apparatus of claim 8, wherein the test card (1) is provided with evenly distributed holes (101).
CN202110682745.0A 2021-06-21 2021-06-21 RGB-D registration precision testing method and device Active CN113256611B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110682745.0A CN113256611B (en) 2021-06-21 2021-06-21 RGB-D registration precision testing method and device


Publications (2)

Publication Number Publication Date
CN113256611A true CN113256611A (en) 2021-08-13
CN113256611B CN113256611B (en) 2021-12-24

Family

ID=77188832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110682745.0A Active CN113256611B (en) 2021-06-21 2021-06-21 RGB-D registration precision testing method and device

Country Status (1)

Country Link
CN (1) CN113256611B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750711A (en) * 2012-06-04 2012-10-24 清华大学 Binocular video depth map obtaining method based on image segmentation and motion estimation
CN206077542U (en) * 2016-08-31 2017-04-05 北京的卢深视科技有限公司 Obtain the security protection network cameras of scene three-dimensional information
CN106780474A (en) * 2016-12-28 2017-05-31 浙江工业大学 A kind of registering and optimization method of the real-time deep figure based on Kinect and coloured picture
CN107507235A (en) * 2017-08-31 2017-12-22 山东大学 A kind of method for registering of coloured image and depth image based on the collection of RGB D equipment
CN107767456A (en) * 2017-09-22 2018-03-06 福州大学 A kind of object dimensional method for reconstructing based on RGB D cameras
CN108399610A (en) * 2018-03-20 2018-08-14 上海应用技术大学 A kind of depth image enhancement method of fusion RGB image information
CN109978929A (en) * 2017-12-28 2019-07-05 舜宇光学(浙江)研究院有限公司 The RGB-D image synthesis optimizing system and method for depth information camera module
CN110378971A (en) * 2019-07-25 2019-10-25 Oppo广东移动通信有限公司 A kind of detection method and device of image alignment precision, equipment, storage medium
CN110942476A (en) * 2019-10-17 2020-03-31 湖南大学 Improved three-dimensional point cloud registration method and system based on two-dimensional image guidance and readable storage medium
CN111080640A (en) * 2019-12-30 2020-04-28 广东博智林机器人有限公司 Hole detection method, device, equipment and medium
CN111524181A (en) * 2020-04-28 2020-08-11 陕西科技大学 Automatic measurement method for porous material holes based on scanning electron microscope image segmentation
CN112164117A (en) * 2020-09-30 2021-01-01 武汉科技大学 V-SLAM pose estimation method based on Kinect camera


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CANBEN YIN et al.: "Removing dynamic 3D objects from point clouds of a moving RGB-D camera", 2015 IEEE International Conference on Information and Automation *
HAO ZHANG et al.: "CoDe4D: Color-Depth Local Spatio-Temporal Features for Human Activity Recognition From RGB-D Videos", IEEE Transactions on Circuits and Systems for Video Technology *
SONG XIBIN: "Research on key technologies of depth image enhancement based on RGB-D information", China Doctoral Dissertations Full-text Database, Information Science and Technology *
TAN YASI: "3D reconstruction by point cloud stitching based on an RGB-D camera", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439543A (en) * 2022-09-02 2022-12-06 北京百度网讯科技有限公司 Method for determining hole position and method for generating three-dimensional model in metauniverse
CN115439543B (en) * 2022-09-02 2023-11-10 北京百度网讯科技有限公司 Method for determining hole position and method for generating three-dimensional model in meta universe
CN115797426A (en) * 2023-02-13 2023-03-14 合肥的卢深视科技有限公司 Image alignment method, electronic device and storage medium
CN115797426B (en) * 2023-02-13 2023-05-12 合肥的卢深视科技有限公司 Image alignment method, electronic device and storage medium
CN116777910A (en) * 2023-08-18 2023-09-19 武汉精立电子技术有限公司 Display screen sub-pixel brightness extraction precision evaluation method and system and electronic equipment
CN116777910B (en) * 2023-08-18 2023-11-28 武汉精立电子技术有限公司 Display screen sub-pixel brightness extraction precision evaluation method and system and electronic equipment


Similar Documents

Publication Publication Date Title
CN113256611B (en) RGB-D registration precision testing method and device
CN106651752B (en) Three-dimensional point cloud data registration method and splicing method
US8988317B1 (en) Depth determination for light field images
CN112818988B (en) Automatic identification reading method and system for pointer instrument
US10424078B2 (en) Height measuring system and method
WO2019080229A1 (en) Chess piece positioning method and system based on machine vision, storage medium, and robot
CN103345755B (en) A kind of Chessboard angular point sub-pixel extraction based on Harris operator
CN109544628B (en) Accurate reading identification system and method for pointer instrument
CN102737370B (en) Method and device for detecting image foreground
CN109215063A (en) A kind of method for registering of event triggering camera and three-dimensional laser radar
CN107924461A (en) For multifactor characteristics of image registration and method, circuit, equipment, system and the correlation computer executable code of tracking
CN101751672A (en) Image processing system
CN104173054A (en) Measuring method and measuring device for height of human body based on binocular vision technique
CN101673397A (en) Digital camera nonlinear calibration method based on LCDs
CN109035292A (en) Moving target detecting method and device based on deep learning
CN107084680A (en) A kind of target depth measuring method based on machine monocular vision
CN110975270A (en) Standing long jump detection method based on marks and computer vision
CN107560541A (en) The measuring method and device of picture centre deviation
CN106709432A (en) Binocular stereoscopic vision based head detecting and counting method
CN113749646A (en) Monocular vision-based human body height measuring method and device and electronic equipment
US11620764B1 (en) Systems and methods for generating point-accurate three-dimensional models with point-accurate color information from a non-cosited capture
CN110288654A (en) A kind of method that the geometry of single image measures
CN114862960A (en) Multi-camera calibrated image ground leveling method and device, electronic equipment and medium
TWI595446B (en) Method for improving occluded edge quality in augmented reality based on depth camera
WO2017202191A1 (en) Facial data measurement method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant