CN116416283A - Rapid registration method and device for RGB-D camera module and RGB-D camera module - Google Patents


Info

Publication number
CN116416283A
CN116416283A
Authority
CN
China
Prior art keywords
rgb
camera
image
depth
region
Prior art date
Legal status
Pending
Application number
CN202111641403.0A
Other languages
Chinese (zh)
Inventor
孙佳睿
李楠
李健
金瑞
Current Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority: CN202111641403.0A
Publication: CN116416283A
Legal status: Pending

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T7/00: Image analysis
                    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
                    • G06T7/10: Segmentation; Edge detection
                        • G06T7/11: Region-based segmentation
                    • G06T7/50: Depth or shape recovery
                    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
                • G06T2207/00: Indexing scheme for image analysis or image enhancement
                    • G06T2207/10: Image acquisition modality
                        • G06T2207/10024: Color image
                        • G06T2207/10028: Range image; Depth image; 3D point clouds
                    • G06T2207/20: Special algorithmic details
                        • G06T2207/20112: Image segmentation details
                            • G06T2207/20132: Image cropping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The quick registration method for the RGB-D camera module registers an image locally, or at specific points, by cropping and enlarging, which simplifies the registration process and reduces the amount of data to be processed, thereby improving registration efficiency. Moreover, because the part of the RGB image that has no corresponding depth region does not participate in the subsequent registration process, registration can be completed even when at least part of the RGB image has no corresponding depth region.

Description

Rapid registration method and device for RGB-D camera module and RGB-D camera module
Technical Field
The present application relates to the field of RGB-D camera modules, and more particularly to a fast registration method and apparatus for an RGB-D camera module and an RGB-D camera module.
Background
An RGB-D camera module is a camera module that outputs not only the RGB information of a subject but also its depth information. With the continuous development of machine vision, RGB-D camera modules are being used ever more widely in fields such as autonomous driving, visual obstacle avoidance, and face recognition.
In practical applications, it is usually necessary not only to acquire the RGB information and the depth information of a subject, but also to register the depth information with the RGB information. Conventional registration methods achieve RGB-D registration mainly through a one-to-one correspondence between the pixels of the depth image and the pixels of the RGB image (i.e., point-by-point correspondence), or through a one-to-many correspondence obtained by interpolating between the pixels of the depth image and the pixels of the RGB image.
However, conventional registration methods have drawbacks. For example, computing the correspondence point by point involves a large amount of calculation and is therefore time-consuming. Furthermore, RGB-D registration can only be performed when the entire RGB image has a corresponding depth region; otherwise the registered image is defective.
Therefore, an optimized registration method for an RGB-D camera module is desired to improve registration efficiency.
Disclosure of Invention
An advantage of the present application is to provide a fast registration method and apparatus for an RGB-D camera module, and an RGB-D camera module, in which registration is achieved merely by determining the positions of specific points in the infrared image and the RGB image, which greatly reduces the amount of data to be processed and improves registration efficiency.
Another advantage of the present application is to provide a fast registration method and apparatus for an RGB-D camera module, and an RGB-D camera module, in which registration is performed by cropping and enlarging, and the part of the RGB image that has no corresponding depth region does not participate in the subsequent registration process, so that registration can be completed even when at least part of the RGB image has no corresponding depth region.
Still another advantage of the present application is to provide a fast registration method and apparatus for an RGB-D camera module, and an RGB-D camera module, in which the RGB image is cropped around a specific reference point with a crop size matching the aspect ratio of the infrared image, so that registration is achieved by making the first reference point in the infrared image correspond to the second reference point in the RGB image, and at least one corner of the infrared image correspond to at least one corner of the enlarged cropped region; this greatly simplifies the registration process and improves registration efficiency.
Yet another advantage of the present application is to provide a fast registration method and apparatus for an RGB-D camera module, and an RGB-D camera module, in which the first reference point of the infrared image is determined from the center point of a point cloud region, which effectively prevents the reference point from being a flying point or a pixel without depth information.
To achieve at least one of the above or other advantages and objects, according to one aspect of the present application, there is provided a registration method for an RGB-D camera module, including:
obtaining calibration parameters of an RGB-D camera module;
obtaining an RGB image and source data of a subject through an RGB camera and a depth camera of the RGB-D camera module, respectively, wherein the field of view of the RGB camera includes the field of view of the depth camera;
processing the source data based on the calibration parameters to obtain a depth point cloud and an infrared image, wherein the depth point cloud is aligned with the infrared image;
calculating a focal length ratio between an effective focal length of the depth camera and an effective focal length of the RGB camera;
determining the size of a region to be cut corresponding to the infrared image in the RGB image based on the size of the infrared image and the focal length ratio;
calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image;
determining the position of the region to be cropped in the RGB image based on the offset; and
enlarging the region to be cropped, using the focal length ratio as the magnification factor, to obtain a registration region in the RGB image that is registered with the infrared image.
In the registration method for the RGB-D camera module, the first reference point is the optical center of the infrared image, and the second reference point is the optical center of the RGB image.
In a registration method for an RGB-D camera module according to the present application, calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image includes: calculating the offset between the first reference point and the second reference point with the following formulas: x = x2 - x1/a, y = y2 - y1/a, where (x1, y1) represents the optical center coordinates of the infrared image, (x2, y2) represents the optical center coordinates of the RGB image, a represents the focal length ratio, x represents the offset in the u direction, and y represents the offset in the v direction.
In the registration method for an RGB-D camera module according to the present application, determining the position of the region to be cropped in the RGB image based on the offset includes: taking the offset as the coordinate of one corner point of the region to be cut in the RGB image; and determining the position of the region to be cut in the RGB image based on the coordinates of the corner points and the size of the region to be cut.
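As a sketch of the two claims above, the corner and size of the region to be cropped follow directly from the optical centers and the focal length ratio. This is our own hedged illustration (function and variable names are not from the patent); it assumes the crop is later enlarged by the ratio a to match the infrared image.

```python
def crop_params(ir_size, ir_center, rgb_center, a):
    """Corner and size of the to-be-cropped region in the RGB image.

    ir_size    -- (width, height) of the infrared image, in pixels
    ir_center  -- (x1, y1), optical center of the infrared image
    rgb_center -- (x2, y2), optical center of the RGB image
    a          -- focal length ratio between the depth and RGB cameras
    """
    w, h = ir_size
    # The crop keeps the infrared aspect ratio, scaled by the focal ratio,
    # so that enlarging it by a afterwards restores the infrared size.
    size = (w / a, h / a)
    # Offset formula from the claim: x = x2 - x1/a, y = y2 - y1/a.
    corner = (rgb_center[0] - ir_center[0] / a,
              rgb_center[1] - ir_center[1] / a)
    return corner, size
```

For example, with a 640x480 infrared image, optical centers (320, 240) and (960, 540), and a = 2, the crop is 320x240 with its corner at (800, 420); scaling the infrared optical center into the crop lands it exactly on the RGB optical center, which is what makes the single-point correspondence sufficient.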
In a registration method for an RGB-D camera module according to the present application, calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image includes: extracting a point cloud area with a preset size of the depth point cloud in the central area of the depth point cloud; counting the depth value of each pixel point in the point cloud area to obtain a central point of the point cloud area, and further obtaining a central point of an effective area in the central area of the infrared image as the first reference point, wherein the central point of the effective area in the central area of the infrared image is aligned with the central point of the point cloud area; projecting the point cloud area to a coordinate system of the RGB image based on the calibration parameters of the RGB-D camera module to obtain a projection area; and counting the values of all pixel points in the projection area to obtain a center point of the projection area as the second reference point.
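The claim above does not specify which statistic is used when "counting the depth value of each pixel point" to obtain the center point. One plausible reading, sketched below with our own names, is to pick the pixel whose depth is the median of a central patch, which is robust against flying points and pixels without depth information.

```python
import numpy as np

def robust_center(depth, patch=20):
    """Pick a reference pixel near the image center whose depth is the
    median of a central patch (a hedged reading of the patent's
    'counting the depth values' step; the exact statistic is not given).
    Pixels with depth 0 (no return) are ignored."""
    h, w = depth.shape
    y0, x0 = h // 2 - patch // 2, w // 2 - patch // 2
    region = depth[y0:y0 + patch, x0:x0 + patch]
    ys, xs = np.nonzero(region > 0)        # drop invalid (zero-depth) pixels
    if ys.size == 0:
        raise ValueError("no valid depth in the central patch")
    vals = region[ys, xs]
    k = np.argsort(vals)[vals.size // 2]   # index of the median depth
    return (x0 + xs[k], y0 + ys[k])        # (u, v) of the first reference point
```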
In the registration method for the RGB-D camera module according to the application, the offset between the first reference point and the second reference point is calculated according to the following formulas: x = x2' - x1'/a, y = y2' - y1'/a, where (x1', y1') denotes the coordinates of the center point of the effective area in the central area of the infrared image, (x2', y2') denotes the coordinates of the center point of the projection area of the RGB image, a denotes the focal length ratio, x denotes the offset in the u direction, and y denotes the offset in the v direction.
In the registration method for an RGB-D camera module according to the present application, determining the position of the region to be cropped in the RGB image based on the offset includes: taking the offset as the coordinate of one corner point of the region to be cut in the RGB image; and determining the position of the region to be cut in the RGB image based on the coordinates of the corner points and the size of the region to be cut.
In the registration method for an RGB-D camera module according to the present application, the point cloud area is square, with a size ranging from 10×10 to 50×50 pixels.
In the registration method for an RGB-D camera module according to the present application, the amplifying the region to be cropped with the focal length ratio as a magnification factor to obtain a registration region in the RGB image registered with the infrared image includes: rotating the region to be cut based on the calibration parameters; and amplifying the region to be cut by taking the focal length proportion as an amplifying factor so as to obtain the registration region.
In the registration method for an RGB-D camera module according to the present application, the difference between the fields of view of the RGB camera and the depth camera in the horizontal direction is greater than 2°, and the distance between the optical centers of the RGB camera and the depth camera is 10 mm.
According to another aspect of the present application, there is provided an RGB-D camera module, including:
an RGB camera;
a depth camera, a field of view of the RGB camera including a field of view of the depth camera; and
a data processing device configured to perform the fast registration method for the RGB-D camera module described above.
In the RGB-D camera module according to the present application, a difference in angle of view of the RGB camera and the depth camera in a horizontal direction is greater than 2 °.
In the RGB-D camera module according to the present application, a distance between the optical centers of the RGB camera and the depth camera in a horizontal direction is less than 15mm.
According to still another aspect of the present application, there is provided a registration apparatus for an RGB-D camera module, including:
the calibration parameter acquisition unit is used for acquiring calibration parameters of the RGB-D camera module;
a source image obtaining unit, configured to obtain an RGB image and an infrared image of a subject through an RGB camera and a depth camera of the RGB-D camera module, where a field of view of the RGB camera includes a field of view of the depth camera;
the depth calculation unit is used for processing the infrared image based on the calibration parameters to obtain a depth point cloud;
A focal length ratio calculating unit for calculating a focal length ratio between an effective focal length of the depth camera and an effective focal length of the RGB camera;
a size determining unit, configured to determine a size of a region to be cropped corresponding to the infrared image in the RGB image based on the size of the infrared image and the focal length ratio;
an offset amount calculating unit for calculating an offset amount between a first reference point in the infrared image and a second reference point in the RGB image;
a region to be cropped determining unit configured to determine a position of the region to be cropped in the RGB image based on the offset; and
a registration region generating unit, configured to enlarge the region to be cropped, using the focal length ratio as the magnification factor, to obtain a registration region in the RGB image registered with the infrared image.
According to still another aspect of the present application, there is provided an electronic device, including:
a processor; and
a memory in which computer program instructions are stored which, when executed by the processor, cause the processor to perform the fast registration method for an RGB-D camera module as described above.
Further objects and advantages of the present application will become fully apparent from the following description and the accompanying drawings.
These and other objects, features, and advantages of the present application will become more fully apparent from the following detailed description, the accompanying drawings, and the appended claims.
Drawings
These and/or other aspects and advantages of the present application will become more apparent and more readily appreciated from the following detailed description of the embodiments of the present application, taken in conjunction with the accompanying drawings, wherein:
fig. 1 illustrates a flow diagram of a registration method for an RGB-D camera module according to an embodiment of the application.
Fig. 2 illustrates a view field range schematic diagram of an RGB camera and a depth camera in a horizontal direction in a registration method for an RGB-D camera module according to an embodiment of the present application.
Fig. 3 illustrates another view field range schematic diagram of an RGB camera and a depth camera in a horizontal direction in a registration method for an RGB-D camera module according to an embodiment of the application.
Fig. 4 illustrates a flowchart of one specific example of a registration method for an RGB-D camera module according to an embodiment of the present application.
Fig. 5A illustrates the first process diagram of a specific example of a registration method for an RGB-D camera module according to an embodiment of the application.
Fig. 5B illustrates the second process diagram of that specific example.
Fig. 5C illustrates the third process diagram of that specific example.
Fig. 5D illustrates the fourth process diagram of that specific example.
Fig. 6 illustrates a flowchart of calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image in another specific example of a registration method for an RGB-D camera module according to an embodiment of the application.
Fig. 7 illustrates a flowchart of another specific example of a registration method for an RGB-D camera module according to an embodiment of the present application.
Fig. 8A illustrates the first process diagram of another specific example of a registration method for an RGB-D camera module according to an embodiment of the application.
Fig. 8B illustrates the second process diagram of that specific example.
Fig. 8C illustrates the third process diagram of that specific example.
Fig. 8D illustrates the fourth process diagram of that specific example.
Fig. 9 illustrates a schematic block diagram of a registration apparatus for an RGB-D camera module according to an embodiment of the application.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the application. The embodiments in the following description are by way of example only and other obvious variations will occur to those skilled in the art. The basic principles of the present application defined in the following description may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the present application.
Summary of the application
As described above, conventional registration methods achieve RGB-D registration mainly through a one-to-one correspondence between the pixels of the depth image and the pixels of the RGB image (i.e., point-by-point correspondence), or through a one-to-many correspondence obtained by interpolating between the pixels of the depth image and the pixels of the RGB image.
However, conventional registration methods have drawbacks. For example, computing the correspondence point by point involves a large amount of calculation and is therefore time-consuming. Furthermore, RGB-D registration can only be performed when the entire RGB image has a corresponding depth region; otherwise the registered image is defective.
To address the long registration time, the applicant proposes that registering only local regions or specific points of the image greatly simplifies the registration process and reduces the amount of data to be processed, thereby improving registration efficiency. Specifically, the RGB image is cropped around the second reference point, i.e., the point corresponding to the first reference point of the infrared image, with a crop size matching the aspect ratio of the infrared image. Registration is then achieved by making the first reference point in the infrared image correspond to the second reference point in the RGB image, and at least one corner of the infrared image correspond to at least one corner of the enlarged cropped region, which simplifies the registration process and improves registration efficiency. In particular, because the part of the RGB image that has no corresponding depth region does not participate in the subsequent registration process, registration can be completed even when at least part of the RGB image has no corresponding depth region.
Based on this, the application proposes a registration method for an RGB-D camera module, comprising: obtaining calibration parameters of the RGB-D camera module; obtaining an RGB image and source data of a subject through an RGB camera and a depth camera of the RGB-D camera module, respectively, where the field of view of the RGB camera includes the field of view of the depth camera; processing the source data based on the calibration parameters to obtain a depth point cloud and an infrared image, where the depth point cloud is aligned with the infrared image; calculating the focal length ratio between the effective focal length of the depth camera and the effective focal length of the RGB camera; determining, based on the size of the infrared image and the focal length ratio, the size of the region to be cropped in the RGB image that corresponds to the infrared image; calculating the offset between a first reference point in the infrared image and a second reference point in the RGB image; determining the position of the region to be cropped in the RGB image based on the offset; and enlarging the region to be cropped, using the focal length ratio as the magnification factor, to obtain a registration region in the RGB image registered with the infrared image.
Furthermore, the present application also provides a registration device for an RGB-D camera module, which includes: the device comprises a calibration parameter acquisition unit, a source image acquisition unit, a depth calculation unit, a focal length ratio calculation unit, a size determination unit, an offset calculation unit, a region to be cut determination unit and a registration region generation unit. The calibration parameter acquisition unit is used for acquiring calibration parameters of the RGB-D camera module. The source image acquisition unit is used for respectively acquiring an RGB image and an infrared image of a shot target through an RGB camera and a depth camera of the RGB-D camera module, wherein the field of view of the RGB camera comprises the field of view of the depth camera. The depth calculation unit is used for processing the infrared image based on the calibration parameters to obtain a depth point cloud. The focal length ratio calculating unit is used for calculating a focal length ratio between an effective focal length of the depth camera and an effective focal length of the RGB camera. The size determining unit is used for determining the size of the region to be cut corresponding to the infrared image in the RGB image based on the size of the infrared image and the focal length ratio. The offset amount calculating unit is used for calculating the offset amount between a first reference point in the infrared image and a second reference point in the RGB image. The region to be cropped determining unit is used for determining the position of the region to be cropped in the RGB image based on the offset. The registration area generating unit is used for amplifying the area to be cut by taking the focal length ratio as a magnification factor so as to obtain a registration area registered with the infrared image in the RGB image.
Having described the basic principles of the present application, various non-limiting embodiments of the present application will now be described in detail with reference to the accompanying drawings.
Exemplary registration method
As shown in Fig. 1, a registration method for an RGB-D camera module according to an embodiment of the present application includes: S110, obtaining calibration parameters of the RGB-D camera module; S120, obtaining an RGB image and source data of a subject through an RGB camera and a depth camera of the RGB-D camera module, respectively, where the field of view of the RGB camera includes the field of view of the depth camera; S130, processing the source data based on the calibration parameters to obtain a depth point cloud and an infrared image, where the depth point cloud is aligned with the infrared image; S140, calculating the focal length ratio between the effective focal length of the depth camera and the effective focal length of the RGB camera; S150, determining, based on the size of the infrared image and the focal length ratio, the size of the region to be cropped in the RGB image that corresponds to the infrared image; S160, calculating the offset between a first reference point in the infrared image and a second reference point in the RGB image; S170, determining the position of the region to be cropped in the RGB image based on the offset; and S180, enlarging the region to be cropped, using the focal length ratio as the magnification factor, to obtain a registration region in the RGB image registered with the infrared image.
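Steps S150 to S180 can be sketched end to end as follows. This is a minimal illustration with invented names, nearest-neighbour enlargement, and no rotation step; a production module would apply the calibrated rotation and proper interpolation.

```python
import numpy as np

def register_rgb_to_ir(rgb, ir_size, ir_center, rgb_center, a):
    """Crop the RGB image around the reference-point offset and enlarge
    the crop by the focal ratio a back to the infrared image size."""
    ir_w, ir_h = ir_size
    crop_w, crop_h = round(ir_w / a), round(ir_h / a)   # S150: crop size
    cx = round(rgb_center[0] - ir_center[0] / a)        # S160/S170: corner
    cy = round(rgb_center[1] - ir_center[1] / a)
    crop = rgb[cy:cy + crop_h, cx:cx + crop_w]          # region to be cropped
    # S180: enlarge by a (nearest-neighbour index maps for simplicity)
    yi = np.arange(ir_h) * crop_h // ir_h
    xi = np.arange(ir_w) * crop_w // ir_w
    return crop[np.ix_(yi, xi)]
```

The output has the same pixel grid as the infrared image, so the depth point cloud (which is aligned with the infrared image) and the registration region correspond pixel for pixel.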
In step S110, calibration parameters of the RGB-D camera module 60 are obtained. Specifically, the RGB-D camera module 60 includes an RGB camera 61 for capturing RGB information and a depth camera 62 for capturing depth information, the specific type of the depth camera 62 is not limited in this application, and the depth camera 62 may be a structured light camera, a binocular camera, or a TOF (Time of Flight) camera.
The calibration parameters of the RGB-D camera module 60 include a calibration inner parameter and a calibration outer parameter. The calibration internal parameters include a principal point (c) and a focal length (f) of the RGB camera 61, and the calibration external parameters include pose transformation parameters between the RGB camera 61 and the depth camera 62, the pose transformation parameters including a first parameter (R) including axis rotation information and a second parameter (T) including axis translation information. In some embodiments of the present application, the calibration parameters include distortion parameters including tangential distortion parameters (p), radial distortion parameters (k), and fisheye model distortion parameters (θ).
In step S120, an RGB image and source data of a subject are obtained by the RGB camera 61 and the depth camera 62 of the RGB-D camera module 60, respectively. Specifically, an RGB image containing the color information of the subject is captured by the RGB camera 61, while a depth image containing depth information is derived from the source data. The source data vary with the type of the depth camera 62; for example, an i-TOF (Indirect Time of Flight) camera outputs photoelectric conversion values at different phases for each pixel.
As shown in Fig. 2, in the embodiment of the present application, the field of view of the RGB camera 61 includes the field of view of the depth camera 62, so that the field of view of the RGB image covers that of the depth image or the infrared image generated by the depth camera 62, which facilitates the subsequent registration process. When the optical axes of the RGB camera 61 and the depth camera 62 deviate in a certain direction (for example, the horizontal direction), both the difference between their fields of view in that direction and the distance between them in that direction affect the registration result.
As shown in Fig. 3, the larger the fields of view of the RGB camera 61 and the depth camera 62 in this direction, the closer their field-of-view intersection lies to the cameras; when the fields of view are small, the intersection lies far from the cameras. Likewise, the farther apart the RGB camera 61 and the depth camera 62 are in this direction, the farther their field-of-view intersection is from the cameras.
A distant field-of-view intersection reduces registration accuracy, because for a subject at close range (e.g., 300 mm) the field of view of the RGB image then fails to fully cover that of the depth image or the infrared image.
Accordingly, the angles of view of the RGB camera 61 and the depth camera 62, and the distance between the RGB camera 61 and the depth camera 62, need to be limited. In some embodiments of the present application, the difference in the angle of view of the RGB camera 61 and the depth camera 62 in the horizontal direction is greater than 2 °, and the distance between the optical centers of the RGB camera 61 and the depth camera 62 in the horizontal direction is less than 15mm.
In a specific example of the present application, the RGB camera 61 has an angle of view of 80 ° in the vertical direction, an angle of view of 53 ° in the horizontal direction, the depth camera 62 has an angle of view of 75 ° in the vertical direction, an angle of view of 50 ° in the horizontal direction, and a difference between the angles of view of the RGB camera 61 and the depth camera 62 in the horizontal direction is 3 °. In this specific example, the distance of the optical centers of the RGB camera 61 and the depth camera 62 in the horizontal direction is 10mm.
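Under a simple geometric model with parallel optical axes (our own sketch, not a formula from the patent), the distance beyond which the depth camera's horizontal field of view is fully contained in the RGB camera's field of view can be checked against the numbers in this example.

```python
import math

# For parallel optical axes offset horizontally by a baseline b, the depth
# field of view is fully inside the RGB field of view beyond the distance
#     d = b / (tan(fov_rgb / 2) - tan(fov_depth / 2)).
b = 10.0                               # optical-center distance, mm
half_rgb = math.radians(53.0 / 2)      # RGB horizontal field of view: 53 deg
half_depth = math.radians(50.0 / 2)    # depth horizontal field of view: 50 deg
d = b / (math.tan(half_rgb) - math.tan(half_depth))
# d comes out to roughly 310 mm, consistent with the ~300 mm close-range
# coverage limit discussed above.
```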
In step S130, the source data is processed based on the calibration parameters to obtain a depth point cloud and an infrared image. Specifically, a depth image and an infrared image can be computed from the source data, and the depth image can be converted into a depth point cloud based on the calibration parameters. The size of the depth image is consistent with that of the infrared image, and their pixels correspond one to one. Accordingly, the depth point cloud is aligned with the infrared image, i.e., the pixels of the depth point cloud correspond one to one with the pixels of the infrared image.
In this embodiment of the present application, a coordinate system may be established based on the calibration parameters, so that the pixels of the depth point cloud and the pixels of the depth image correspond to each other. The pixel coordinate system may be established by the standard pinhole projection: u = fx·X/Z + cx, v = fy·Y/Z + cy, where fx, fy are the focal lengths in pixels, (cx, cy) is the principal point, and (X, Y, Z) are the camera-space coordinates.
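The conversion from the depth image to an aligned depth point cloud can be sketched as follows. This is a minimal illustration assuming a standard pinhole model; the function name and all numeric values (focal lengths, principal point, depth) are illustrative, not taken from the patent.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (H x W) into a 3-D point cloud with the
    pinhole model; each output point stays aligned with its source pixel."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinate grids
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx   # X = (u - cx) * Z / fx
    y = (v - cy) * z / fy   # Y = (v - cy) * Z / fy
    return np.stack([x, y, z], axis=-1)  # H x W x 3, one point per pixel

# A flat 4x4 depth map at 1000 mm with the principal point at (2, 2)
cloud = depth_to_point_cloud(np.full((4, 4), 1000.0),
                             fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

Because the output keeps the H x W grid, the point cloud remains in one-to-one pixel correspondence with the infrared image, as required by step S130.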
In the process of establishing the coordinate system, distortion correction may be performed. Specifically, radial distortion correction can be performed by the following formula:
x0 = x(1 + k1r² + k2r⁴ + k3r⁶), y0 = y(1 + k1r² + k2r⁴ + k3r⁶), where k1, k2, k3 are the radial distortion parameters.
Tangential distortion correction can be performed by the following formula:
x0 = x + [2p1xy + p2(r² + 2x²)], y0 = y + [2p2xy + p1(r² + 2y²)], where p1, p2 are the tangential distortion parameters.
The fisheye model distortion correction can be performed by the following formula:
x0 = (θd/r)·a0, y0 = (θd/r)·b0, where θd = θ(1 + k1θ² + k2θ⁴ + k3θ⁶ + k4θ⁸), θ = arctan(r), and r² = a0² + b0².
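The radial and tangential corrections above can be sketched as a single distortion function, following the common practice (as in the Brown model used by many calibration toolkits) of combining both terms in one expression; the coefficient values below are purely illustrative.

```python
def distort(x, y, k1, k2, k3, p1, p2):
    """Apply radial and tangential distortion to a point (x, y) in
    normalized image coordinates, per the formulas above."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3  # 1 + k1 r^2 + k2 r^4 + k3 r^6
    x0 = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y0 = y * radial + 2.0 * p2 * x * y + p1 * (r2 + 2.0 * y * y)
    return x0, y0

# Zero coefficients leave the point unchanged
identity = distort(0.1, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0)
```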
In the embodiment of the present application, since the infrared image is aligned with the depth image and the depth point cloud, the registration between the depth image or the depth point cloud and the RGB image may be achieved by registering the infrared image and the RGB image. As described above, the conventional registration method mainly implements RGB-D registration by placing the pixels of the depth image in one-to-one correspondence with the pixels of the RGB image (i.e., a point-by-point correspondence), so the amount of calculation is large and the time consumption is excessive. Moreover, such a method can achieve RGB-D registration only on the premise that the whole RGB image has a corresponding depth region.
In contrast, the method of the present application registers local regions or specific points of the image, which simplifies the registration process, reduces the data processing amount, and improves the registration efficiency. Specifically, first, the RGB image is cropped based on the size of the infrared image and the focal length ratio; then, the region to be cropped is enlarged to conform to the size of the infrared image. In the embodiment of the application, only the region to be cropped of the RGB image is used for registration, and the part of the RGB image that has no corresponding depth region does not participate in the subsequent registration process, so that registration can be completed even when at least a part of the RGB image has no corresponding depth region.
More specifically, in the embodiment of the present application, the field of view of the RGB image is wider, so the RGB image needs to be cropped. Since the focal lengths of the depth camera 62 and the RGB camera 61 are different, a focal length ratio between the effective focal length of the depth camera 62 and the effective focal length of the RGB camera 61 is determined before cropping the RGB image. Accordingly, in step S140, the focal length ratio between the effective focal length of the depth camera 62 and the effective focal length of the RGB camera 61 is calculated. Here, the effective focal length refers to the distance from the lens center to the focal point.
In step S150, the size of the region to be cropped corresponding to the infrared image in the RGB image is determined based on the size of the infrared image and the focal length ratio. In an embodiment of the present application, the size of the region to be cropped is the size of the infrared image divided by the focal length ratio. That is, if the width of the infrared image is w and its height is h, the width of the region to be cropped is w/a and its height is h/a, where a represents the focal length ratio. The aspect ratio of the region to be cropped is consistent with that of the infrared image, so that the region to be cropped matches the size of the infrared image after being enlarged with the focal length ratio as the magnification factor.
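Step S150 reduces to one division per dimension; the sketch below uses illustrative sizes (not from the patent) to show that the aspect ratio is preserved.

```python
def crop_region_size(ir_width, ir_height, focal_ratio):
    """Size of the region to be cropped in the RGB image (step S150):
    the infrared image size divided by the focal length ratio a."""
    return ir_width / focal_ratio, ir_height / focal_ratio

# Illustrative numbers: a 640x480 infrared image and a focal ratio of 1.25
w, h = crop_region_size(640, 480, 1.25)  # 512.0 x 384.0
```

Enlarging the resulting w/a by h/a crop with the factor a restores exactly w by h, matching the infrared image size.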
In the embodiment of the application, the RGB image is cropped around a specific reference point, and the offset of the region to be cropped is determined, so as to determine the positional relationship between the region to be cropped and the infrared image. Accordingly, in step S160, an offset between a first reference point in the infrared image and a second reference point in the RGB image is calculated.
As shown in fig. 4 to 5D, in a specific example of the present application, the optical center of the infrared image is taken as the first reference point, and the optical center of the RGB image is taken as the second reference point. The offset between the first reference point and the second reference point is calculated with the following formula: x = x2 − x1/a, y = y2 − y1/a, where (x1, y1) represents the optical center coordinates of the infrared image, (x2, y2) represents the optical center coordinates of the RGB image, a represents the focal length ratio, x represents the offset in the u direction, and y represents the offset in the v direction.
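The offset formula of this example can be sketched directly; the optical-center coordinates and focal ratio used below are illustrative values, not values from the patent.

```python
def reference_offset(ir_center, rgb_center, focal_ratio):
    """Offset between the reference points per step S160:
    x = x2 - x1/a, y = y2 - y1/a, with a the focal length ratio."""
    x1, y1 = ir_center   # optical center of the infrared image
    x2, y2 = rgb_center  # optical center of the RGB image
    return x2 - x1 / focal_ratio, y2 - y1 / focal_ratio

# Illustrative optical centers and focal ratio
dx, dy = reference_offset((320.0, 240.0), (640.0, 360.0), 2.0)
```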
As shown in fig. 6 to 8D, in another specific example of the present application, the center point of the effective area in the central area of the infrared image is taken as the first reference point, and the center point of the projection area in the RGB image corresponding to that effective area is taken as the second reference point, so as to avoid the case where the first reference point corresponding to the second reference point is a flying point or a pixel without depth information.
In this specific example, the offset between the first reference point and the second reference point is calculated with the following formula: x = x2′ − x1′/a, y = y2′ − y1′/a, where (x1′, y1′) denotes the coordinates of the center point of the effective area in the central area of the infrared image, (x2′, y2′) denotes the coordinates of the center point of the projection area of the RGB image, a denotes the focal length ratio, x denotes the offset in the u direction, and y denotes the offset in the v direction.
In this specific example, first, a point cloud region having a preset size of the depth point cloud in a central region thereof is extracted; then, counting the depth value of each pixel point in the point cloud area to obtain a center point of the point cloud area, and further obtaining a center point of an effective area in the center area of the infrared image as the first reference point, wherein the center point of the effective area in the center area of the infrared image is aligned with the center point of the point cloud area; then, the point cloud area is projected to a coordinate system of the RGB image based on the calibration parameters of the RGB-D camera module 60 to obtain a projection area; then, the values of the pixel points in the projection area are counted to obtain the center point of the projection area as the second reference point.
Accordingly, step S160 includes: s161, extracting a point cloud area with a preset size of the depth point cloud in a central area of the depth point cloud; s162, counting depth values of all pixel points in the point cloud area to obtain a center point of the point cloud area, and further obtaining a center point of an effective area in a center area of the infrared image as the first reference point, wherein the center point of the effective area in the center area of the infrared image is aligned with the center point of the point cloud area; s163, projecting the point cloud area to a coordinate system of the RGB image based on the calibration parameters of the RGB-D camera module 60 to obtain a projection area; and S164, counting the values of all pixel points in the projection area to obtain a center point of the projection area as the second reference point.
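Step S163 can be sketched as a standard rigid transform followed by a pinhole projection. This is a minimal illustration: the extrinsic rotation R, translation t, and intrinsics below are assumptions for demonstration, not calibration values from the patent.

```python
import numpy as np

def project_to_rgb(points, R, t, fx, fy, cx, cy):
    """Project 3-D points from the depth-camera frame into RGB pixel
    coordinates (step S163): apply the extrinsics [R|t], then the RGB
    camera's pinhole intrinsics."""
    cam = points @ R.T + t                 # depth frame -> RGB camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=-1)

# Identity rotation with a 10 mm horizontal baseline (illustrative)
pts = np.array([[0.0, 0.0, 1000.0]])
uv = project_to_rgb(pts, np.eye(3), np.array([10.0, 0.0, 0.0]),
                    fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```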
In step S161, a point cloud region of suitable size should be extracted: too small a region reduces projection stability, while too large a region increases the amount of calculation. In some embodiments of the present application, the point cloud region is square with a size of 10×10 to 50×50; in a specific example, the size of the point cloud region is 20×20.
In step S162, the effective area refers to the area in the central region corresponding to the depth point cloud whose depth values are non-zero, that is, the depth value of each pixel of the point cloud region is non-zero. Accordingly, the maximum and minimum extents of pixels with non-zero depth values in the horizontal direction of the central area of the depth point cloud are counted, and likewise in the vertical direction, so as to obtain the center point of the point cloud region. Since the depth point cloud is aligned with the infrared image, the center point of the effective area of the central area of the infrared image can be determined from the center point of the point cloud region.
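The statistics of step S162 can be sketched as follows. Since the patent only states that the non-zero extrema are counted, the concrete rule used here (midpoint of the bounding box of non-zero-depth pixels) is an interpretation for illustration.

```python
import numpy as np

def valid_region_center(region):
    """Center of the valid (non-zero depth) area inside the central
    point-cloud region (step S162), taken here as the midpoint of the
    bounding box of non-zero pixels (an interpretation; see lead-in)."""
    rows, cols = np.nonzero(region)
    if rows.size == 0:
        return None  # no depth information in the region
    cy = (rows.min() + rows.max()) / 2.0
    cx = (cols.min() + cols.max()) / 2.0
    return cx, cy

region = np.zeros((20, 20))
region[5:15, 4:16] = 1000.0  # valid depth block surrounded by zero-depth pixels
center = valid_region_center(region)  # (9.5, 9.5)
```

Because the depth point cloud is aligned with the infrared image, this center point directly gives the first reference point.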
After the offset between the first reference point and the second reference point is determined, the position of the region to be cropped in the RGB image can be determined. In this embodiment of the present application, the offset is used as the coordinates of a corner point of the region to be cropped in the RGB image, and the position of the region to be cropped in the RGB image is determined based on the coordinates of that corner point and the size of the region to be cropped.
Accordingly, step S170 includes: taking the offset as the coordinate of one corner point of the region to be cut in the RGB image; and determining the position of the region to be cut in the RGB image based on the coordinates of the corner points and the size of the region to be cut.
In step S180, the region to be cropped is enlarged with the focal length ratio as the magnification factor to obtain a registration region in the RGB image registered with the infrared image. Because the aspect ratio of the region to be cropped is consistent with that of the infrared image, the enlarged region matches the size of the infrared image. Moreover, since the offset (x = x2 − x1/a, y = y2 − y1/a) is taken as the coordinates of one corner point of the region to be cropped in the RGB image, after the region to be cropped is enlarged with the focal length ratio as the magnification factor, the second reference point corresponds to the first reference point, and the corner points of the region to be cropped correspond to the corresponding corner points of the infrared image, thereby yielding a registration region in the RGB image registered with the infrared image.
Specifically, first, the region to be cropped is rotated based on the calibration parameters for rotation correction, ensuring that there is no rotation about the z direction; the rotation angle is obtained from the calibration parameters. Then, the region to be cropped is enlarged with the focal length ratio as the magnification factor to obtain the registration region. The enlargement may be performed by bilinear interpolation or by other means, which is not limited in this application.
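The enlargement of step S180 with bilinear interpolation can be sketched in plain NumPy. This is only a minimal sketch for a single-channel image; a production pipeline would typically use an optimized resize (e.g., cv2.resize with INTER_LINEAR), and the 4×4 input below is illustrative.

```python
import numpy as np

def enlarge_bilinear(crop, factor):
    """Enlarge a single-channel cropped region by the given factor using
    bilinear interpolation (step S180)."""
    h, w = crop.shape
    out_h, out_w = int(round(h * factor)), int(round(w * factor))
    # Source coordinates sampled for every destination pixel
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                # vertical interpolation weights
    wx = (xs - x0)[None, :]                # horizontal interpolation weights
    top = crop[np.ix_(y0, x0)] * (1 - wx) + crop[np.ix_(y0, x1)] * wx
    bot = crop[np.ix_(y1, x0)] * (1 - wx) + crop[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

enlarged = enlarge_bilinear(np.arange(16, dtype=float).reshape(4, 4), 2.0)
```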
In summary, the rapid registration method for the RGB-D camera module has been explained. The method substantially reduces the data processing amount by means of specific-point registration or local registration, thereby improving the registration efficiency, and it can complete registration even when at least a part of the RGB image has no corresponding depth region.
Exemplary RGB-D camera Module
According to another aspect of the present application, an RGB-D camera module 60 is also provided. The RGB-D camera module 60 includes an RGB camera 61, a depth camera 62, and a data processing device 63. The positional relationship of the RGB camera 61 and the depth camera 62 is: the field of view of the RGB camera 61 includes the field of view of the depth camera 62.
Specifically, in some embodiments of the present application, the difference in the angle of view of the RGB camera 61 and the depth camera 62 in the horizontal direction is greater than 2 °, and the distance of the optical centers of the RGB camera 61 and the depth camera 62 in the horizontal direction is less than 15mm.
In a specific example of the present application, the RGB camera 61 has an angle of view of 80 ° in the vertical direction, an angle of view of 53 ° in the horizontal direction, the depth camera 62 has an angle of view of 75 ° in the vertical direction, an angle of view of 50 ° in the horizontal direction, and a difference between the angles of view of the RGB camera 61 and the depth camera 62 in the horizontal direction is 3 °. In this specific example, the distance of the optical centers of the RGB camera 61 and the depth camera 62 in the horizontal direction is 10mm.
The data processing device 63 is configured to perform the fast registration method for RGB-D camera modules illustrated in fig. 1 to 8D. Here, the rapid registration method for an RGB-D camera module has been described in detail in the description of the rapid registration method for an RGB-D camera module illustrated above with reference to fig. 1 to 8D, and thus, repetitive descriptions thereof will be omitted.
Exemplary Registration Device
According to still another aspect of the present application, there is also provided a rapid registration apparatus 10 for an RGB-D camera module, as shown in fig. 9, the rapid registration apparatus 10 for an RGB-D camera module includes: a calibration parameter acquisition unit 11, a source image acquisition unit 12, a depth calculation unit 13, a focal length ratio calculation unit 14, a size determination unit 15, an offset amount calculation unit 16, a region to be clipped determination unit 17, and a registration region generation unit 18.
Specifically, the calibration parameter obtaining unit 11 is configured to obtain the calibration parameters of the RGB-D camera module. The source image acquiring unit 12 is configured to acquire an RGB image and an infrared image of a subject through the RGB camera 61 and the depth camera 62 of the RGB-D camera module 60, respectively, wherein the field of view of the RGB camera 61 includes the field of view of the depth camera 62. The depth calculation unit 13 is configured to process the infrared image based on the calibration parameters to obtain a depth point cloud. The focal length ratio calculation unit 14 is configured to calculate the focal length ratio between the effective focal length of the depth camera 62 and the effective focal length of the RGB camera 61. The size determining unit 15 is configured to determine, based on the size of the infrared image and the focal length ratio, the size of a region to be cropped in the RGB image corresponding to the infrared image. The offset amount calculation unit 16 is configured to calculate the offset between a first reference point in the infrared image and a second reference point in the RGB image. The region to be cropped determining unit 17 is configured to determine the position of the region to be cropped in the RGB image based on the offset. The registration area generating unit 18 is configured to enlarge the region to be cropped with the focal length ratio as the magnification factor to obtain a registration region in the RGB image registered with the infrared image.
The specific functions of the respective units have been described in detail in the description of the rapid registration method for an RGB-D camera module described above with reference to fig. 1 to 8D, and thus, repetitive descriptions thereof will be omitted.
In summary, the rapid registration device 10 for RGB-D camera modules has been illustrated. The device substantially reduces the data processing amount by means of specific-point registration or local registration, thereby improving the registration efficiency, and registration can also be completed even when at least a part of the RGB image has no corresponding depth region.
Exemplary electronic device
According to still another aspect of the present application, there is also provided an electronic device 80, the electronic device 80 including: a memory 81 and a processor 82, in which memory 81 computer program instructions are stored which, when run by the processor 82, cause the processor 82 to perform the fast registration method for an RGB-D camera module as illustrated with reference to fig. 1 to 8D. Here, the rapid registration method for an RGB-D camera module has been described in detail in the description of the rapid registration method for an RGB-D camera module illustrated above with reference to fig. 1 to 8D, and thus, repetitive descriptions thereof will be omitted.
In summary, the electronic device 80 is illustrated, and the electronic device 80 can perform a fast registration method for an RGB-D camera module to improve registration efficiency.
Those skilled in the art will appreciate that the embodiments of the present application described above and shown in the drawings are by way of example only and not limitation. The objects of the present application have been fully and effectively achieved. The functional and structural principles of the present application have been shown and described in the examples and the embodiments of the present application are susceptible to any variations or modifications without departing from the principles.

Claims (15)

1. A rapid registration method for an RGB-D camera module, comprising:
obtaining calibration parameters of an RGB-D camera module;
the RGB image and source data of a shot target are respectively obtained through an RGB camera and a depth camera of the RGB-D shooting module, wherein the view field of the RGB camera comprises the view field of the depth camera;
processing the source data based on the calibration parameters to obtain a depth point cloud and an infrared image, wherein the depth point cloud is aligned with the infrared image;
calculating a focal length ratio between an effective focal length of the depth camera and an effective focal length of the RGB camera;
Determining the size of a region to be cut corresponding to the infrared image in the RGB image based on the size of the infrared image and the focal length ratio;
calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image;
determining the position of the region to be cropped in the RGB image based on the offset; and
and amplifying the region to be cut by taking the focal length ratio as a magnification factor to obtain a registration region in the RGB image, which is registered with the infrared image.
2. The method for rapid registration of an RGB-D camera module of claim 1, wherein the first reference point is an optical center of the infrared image and the second reference point is an optical center of the RGB image.
3. The method for rapid registration of an RGB-D camera module of claim 2, wherein calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image comprises:
calculating an offset between the first reference point and the second reference point with the following formula: x=x2-x 1/a, y=y2-y 1/a, where (x 1, y 1) represents the optical center coordinates of the infrared image, (x 2, y 2) represents the optical center coordinates of the RGB image, a represents the focal length ratio, x represents the offset in the u direction, and y represents the offset in the v direction.
4. A method of rapid registration for an RGB-D camera module according to claim 3, wherein determining the location of the region to be cropped in the RGB image based on the offset comprises:
taking the offset as the coordinate of one corner point of the region to be cut in the RGB image; and
and determining the position of the region to be cut in the RGB image based on the coordinates of the corner points and the size of the region to be cut.
5. The method for rapid registration of an RGB-D camera module of claim 1, wherein calculating an offset between a first reference point in the infrared image and a second reference point in the RGB image comprises:
extracting a point cloud area with a preset size of the depth point cloud in the central area of the depth point cloud;
counting the depth value of each pixel point in the point cloud area to obtain a central point of the point cloud area, and further obtaining a central point of an effective area in the central area of the infrared image as the first reference point, wherein the central point of the effective area in the central area of the infrared image is aligned with the central point of the point cloud area;
projecting the point cloud area to a coordinate system of the RGB image based on the calibration parameters of the RGB-D camera module to obtain a projection area; and
And counting the values of all pixel points in the projection area to obtain the central point of the projection area as the second reference point.
6. The method for rapid registration of an RGB-D camera module of claim 5, wherein the offset between the first reference point and the second reference point is calculated with the following formula: x=x2 '-x1'/a, y=y2 '-y1'/a, where (x 1', y 1') denotes coordinates of a center point of an effective area in a center area of the infrared image, (x 2', y 2') denotes coordinates of a center point of a projection area of the RGB image, a denotes the focal length ratio, x denotes an offset in a u direction, and y denotes an offset in a v direction.
7. The method for rapid registration of an RGB-D camera module of claim 6, wherein determining the location of the region to be cropped in the RGB image based on the offset comprises:
taking the offset as the coordinate of one corner point of the region to be cut in the RGB image; and
and determining the position of the region to be cut in the RGB image based on the coordinates of the corner points and the size of the region to be cut.
8. The method for rapid registration of RGB-D camera modules of claim 5, wherein the point cloud area is square in shape and has a size of 10 x 10 to 50 x 50.
9. The method for rapid registration of an RGB-D camera module of claim 1, wherein magnifying the region to be cropped with the focal length ratio as a magnification to obtain a registered region in the RGB image that is registered with the infrared image comprises:
rotating the region to be cut based on the calibration parameters; and
and amplifying the region to be cut by taking the focal length proportion as an amplifying factor so as to obtain the registration region.
10. The rapid registration method for an RGB-D camera module of claim 1, wherein a difference between angles of view of the RGB camera and the depth camera in a horizontal direction is greater than 2 °, and a distance between center points of the RGB camera and the depth camera is 10mm.
11. An RGB-D camera module, comprising:
an RGB camera;
a depth camera, a field of view of the RGB camera including a field of view of the depth camera; and
data processing apparatus for performing the fast registration method for an RGB-D camera module according to any one of claims 1-10.
12. The RGB-D camera module of claim 11, wherein a difference in field angle of the RGB camera and the depth camera in a horizontal direction is greater than 2 °.
13. The RGB-D camera module of claim 12, wherein the distance of the optical centers of the RGB camera and the depth camera in the horizontal direction is less than 15mm.
14. A rapid registration device for an RGB-D camera module, comprising:
the calibration parameter acquisition unit is used for acquiring calibration parameters of the RGB-D camera module;
a source image obtaining unit, configured to obtain an RGB image and an infrared image of a subject through an RGB camera and a depth camera of the RGB-D camera module, where a field of view of the RGB camera includes a field of view of the depth camera;
the depth calculation unit is used for processing the infrared image based on the calibration parameters to obtain a depth point cloud;
a focal length ratio calculating unit for calculating a focal length ratio between an effective focal length of the depth camera and an effective focal length of the RGB camera;
a size determining unit, configured to determine a size of a region to be cropped corresponding to the infrared image in the RGB image based on the size of the infrared image and the focal length ratio;
an offset amount calculating unit for calculating an offset amount between a first reference point in the infrared image and a second reference point in the RGB image;
A region to be cropped determining unit configured to determine a position of the region to be cropped in the RGB image based on the offset; and
and the registration area generating unit is used for amplifying the area to be cut by taking the focal length ratio as a magnification factor so as to obtain a registration area registered with the infrared image in the RGB image.
15. An electronic device, comprising:
a processor; and
a memory having stored therein computer program instructions that, when executed by the processor, cause the processor to perform the fast registration method for an RGB-D camera module of any one of claims 1-10.
CN202111641403.0A 2021-12-29 2021-12-29 Rapid registration method and device for RGB-D camera module and RGB-D camera module Pending CN116416283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111641403.0A CN116416283A (en) 2021-12-29 2021-12-29 Rapid registration method and device for RGB-D camera module and RGB-D camera module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111641403.0A CN116416283A (en) 2021-12-29 2021-12-29 Rapid registration method and device for RGB-D camera module and RGB-D camera module

Publications (1)

Publication Number Publication Date
CN116416283A true CN116416283A (en) 2023-07-11

Family

ID=87053150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111641403.0A Pending CN116416283A (en) 2021-12-29 2021-12-29 Rapid registration method and device for RGB-D camera module and RGB-D camera module

Country Status (1)

Country Link
CN (1) CN116416283A (en)

Similar Documents

Publication Publication Date Title
CN109767474B (en) Multi-view camera calibration method and device and storage medium
WO2020259271A1 (en) Image distortion correction method and apparatus
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
WO2019232793A1 (en) Two-camera calibration method, electronic device and computer-readable storage medium
JP2019510234A (en) Depth information acquisition method and apparatus, and image acquisition device
CN109859137B (en) Wide-angle camera irregular distortion global correction method
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN110136205B (en) Parallax calibration method, device and system of multi-view camera
WO2022126374A1 (en) Image annotation method and apparatus, electronic device, and computer readable storage medium
CN110264510A (en) A method of image zooming-out depth of view information is acquired based on binocular
CN111311658B (en) Image registration method and related device for dual-light imaging system
CN115379122A (en) Video content dynamic splicing method, system and storage medium
CN107680035B (en) Parameter calibration method and device, server and readable storage medium
CN105335959B (en) Imaging device quick focusing method and its equipment
JP7311407B2 (en) Posture estimation device and posture estimation method
CN116503567B (en) Intelligent modeling management system based on AI big data
JP2009301181A (en) Image processing apparatus, image processing program, image processing method and electronic device
CN110505467B (en) Binocular camera module matched with adaptive filter and stereo matching method thereof
CN109990756B (en) Binocular ranging method and system
CN111630569B (en) Binocular matching method, visual imaging device and device with storage function
CN111696141A (en) Three-dimensional panoramic scanning acquisition method and device and storage device
CN116416283A (en) Rapid registration method and device for RGB-D camera module and RGB-D camera module
JP2021033605A (en) Image processor and method for processing image
CN110188756B (en) Product positioning method
JPH09231369A (en) Picture information input device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination