CN109579794B - System and method for selecting key frame by iterative closest point method - Google Patents
- Publication number
- CN109579794B (application CN201710897413.8A)
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Image Analysis (AREA)
Abstract
A system for selecting a key frame by the iterative closest point method comprises a reference frame selector, which generates a reference frame according to the current frame and the current key frame; an iterative closest point loop unit, which performs the iterative closest point method on the reference frame and the current frame to generate the pose of the current frame; and a key frame update unit, which generates a new key frame according to a difference condition between the pose of the current frame and the pose of the reference frame.
Description
Technical Field
The present invention relates to visual odometry, and more particularly, to a system and method of selecting key frames (keyframes) by the iterative closest point (ICP) method.
Background
Visual odometry is used in robotics and computer vision, where images captured by a camera (e.g., an RGB-D camera) are analyzed to determine the position and orientation of a robot. An iterative closest point (ICP) method aligns a source frame with a target frame to estimate the robot's movement. Fig. 1 shows a block diagram of a conventional visual odometry system 100 disclosed in "Fast Visual Odometry Using Intensity-Assisted Iterative Closest Point" by Shile Li et al., IEEE Robotics and Automation Letters, vol. 1, no. 2, July 2016, the contents of which are considered part of this specification.
The system 100 reduces computation cost and the effect of outliers. As shown in the visual odometry system 100 of Fig. 1, salient point selection 11 is performed on the source frame to provide information useful for ICP. Correspondence matching 12 is then performed to determine matching points. Weighting 13 based on robust statistics is applied to the corresponding pairs. Incremental transformation 14 is performed to reduce the distance between the established correspondences. Operations 11 to 14 are repeated until the incremental transformation is less than a threshold or the maximum allowed number of iterations has been reached.
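The loop described above can be sketched as follows. This is a minimal illustrative sketch, not the patented system: it uses brute-force nearest-neighbor matching and an SVD-based (Kabsch) incremental transform, and omits the salient-point selection and robust weighting stages of system 100.

```python
import numpy as np

def icp_align(source, target, max_iters=30, tol=1e-6):
    """Minimal ICP loop: align `source` (N x 3) to `target` (M x 3).

    Each iteration matches every source point to its nearest target
    point, solves the incremental rigid transform, applies it, and
    stops when the incremental translation falls below `tol`.
    """
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(max_iters):
        # Correspondence matching: nearest target point for each source point.
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[np.argmin(d, axis=1)]
        # Incremental transformation via SVD on centered point sets.
        sc, mc = src.mean(axis=0), matched.mean(axis=0)
        H = (src - sc).T @ (matched - mc)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mc - R @ sc
        src = src @ R.T + t        # apply the increment
        R_total, t_total = R @ R_total, R @ t_total + t
        if np.linalg.norm(t) < tol:  # incremental transform below threshold
            break
    return R_total, t_total
```

A practical system would replace the brute-force matching with a k-d tree and down-weight outlier pairs, as the pipeline above describes.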
Since sensor noise always introduces estimation error, conventional frame-to-frame alignment inherently accumulates drift. To overcome the drift problem, Christian Kerl et al. proposed "Dense Visual SLAM for RGB-D Cameras" (Proceedings of the International Conference on Intelligent Robots and Systems (IROS), 2013), the contents of which are considered part of this specification. A keyframe-based pose simultaneous localization and mapping (SLAM) approach estimates the transformation between the current image and a keyframe to limit local drift. As long as the camera stays close enough to the keyframe, drift does not accumulate. However, a SLAM system additionally requires keyframe selection, loop closure detection and verification, and map optimization.
In the conventional method, a new key frame is generated when the current image cannot be matched to the nearest key frame. However, the transformation determined between the current frame and the latest key frame may contain large error, so the selected key frame can cause serious tracking loss. A novel mechanism is therefore needed to avoid selecting a key frame with large error.
Disclosure of Invention
In view of the foregoing, it is an object of the present invention to provide a system and method of selecting a key frame with minimum error by the iterative closest point (ICP) method, applicable to constructing or updating a map of an unknown environment while simultaneously tracking and locating within it.
The object of the invention is achieved by the following technical solutions.
According to an embodiment of the present invention, a system for selecting a key frame by the iterative closest point method includes a reference frame selector, an iterative closest point loop unit, and a key frame update unit. The reference frame selector generates a reference frame according to the current frame and the current key frame. The iterative closest point loop unit performs the iterative closest point method on the reference frame and the current frame to generate the pose of the current frame. The key frame update unit generates a new key frame according to a difference condition between the pose of the current frame and the pose of the reference frame.
The object of the invention can be further achieved by the following technical measures.
In the above system for selecting a key frame by the iterative closest point method, the reference frame selector performs the following steps:
performing at least one iteration of the iterative closest point method on the current frame and the current key frame; and
if the matching quality of the result of the iterative closest point method does not reach a predetermined level, selecting the current spare frame as the reference frame; otherwise, selecting the current key frame as the reference frame.
In the above system, the matching quality does not reach the predetermined level if the number of inliers in the result of the iterative closest point method is less than a predetermined value.
In the above system, the current spare frame is a temporally previous frame.
In the above system, the difference condition involves a displacement, a rotation angle, and a depth distance between the pose of the current frame and the pose of the reference frame.
In the above system, the difference condition comprises:
displacement sub-condition: displacement(t) > η1;
rotation sub-condition: rotation(Rθ) > η2;
where t represents the displacement, θ represents the rotation angle, R represents the depth distance, and η1 and η2 are predetermined thresholds of the difference.
In the above system, if at least one sub-condition is satisfied, the current frame is used as the new key frame; otherwise, the reference frame is used as the new key frame.
The above system may further comprise a spare frame update unit for providing a new spare frame according to the current frame.
In the above system, the spare frame update unit comprises a storage device for temporarily storing the current frame as the new spare frame for the next execution of the iterative closest point method.
The object of the invention can also be achieved by the following technical solutions.
According to an embodiment of the present invention, a method of selecting a key frame by the iterative closest point method includes:
generating a reference frame according to the current frame and the current key frame;
performing the iterative closest point method on the reference frame and the current frame to generate the pose of the current frame; and
generating a new key frame according to a difference condition between the pose of the current frame and the pose of the reference frame.
The object of the invention can be further achieved by the following technical measures.
In the above method of selecting a key frame by the iterative closest point method, the step of generating the reference frame comprises:
performing at least one iteration of the iterative closest point method on the current frame and the current key frame; and
if the matching quality of the result of the iterative closest point method does not reach a predetermined level, selecting the current spare frame as the reference frame; otherwise, selecting the current key frame as the reference frame.
In the above method, the matching quality does not reach the predetermined level if the number of inliers in the result of the iterative closest point method is less than a predetermined value.
In the above method, the current spare frame is a temporally previous frame.
In the above method, the difference condition involves a displacement, a rotation angle, and a depth distance between the pose of the current frame and the pose of the reference frame.
In the above method, the difference condition comprises:
displacement sub-condition: displacement(t) > η1;
rotation sub-condition: rotation(Rθ) > η2;
where t represents the displacement, θ represents the rotation angle, R represents the depth distance, and η1 and η2 are predetermined thresholds of the difference.
In the above method, if at least one sub-condition is satisfied, the current frame is used as the new key frame; otherwise, the reference frame is used as the new key frame.
Drawings
Fig. 1 shows a block diagram of a conventional visual ranging system.
Fig. 2 is a block diagram illustrating a system for selecting a key frame by the iterative closest point (ICP) method according to an embodiment of the invention.
Fig. 3 is a flowchart of the reference frame selector of Fig. 2.
Fig. 4 is a flowchart of the key frame update unit of Fig. 2 according to an embodiment of the present invention.
[Description of main element symbols]
100: visual odometry system; 11: salient point selection
12: correspondence matching; 13: weighting of corresponding pairs
14: incremental transformation; 200: system
21: reference frame selector; 211: perform the iterative closest point method
212: whether the number of inliers is less than the predetermined value; 22: iterative closest point loop unit
23: key frame update unit; 231: difference condition
24: spare frame update unit; ICP: iterative closest point method
C: current frame; K: key frame
K': new key frame; B: spare frame
B': new spare frame; S: reference frame
Pc: pose of the current frame; n: predetermined value
t: displacement; R: depth distance
θ: rotation angle; η1: threshold
η2: threshold
Detailed Description
Fig. 2 is a block diagram illustrating a system 200 for selecting a key frame by the iterative closest point (ICP) method according to an embodiment of the present invention. The ICP method is generally used to minimize the difference between two point clouds. The system 200 of this embodiment can be applied to simultaneous localization and mapping (SLAM), which constructs or updates a map of an unknown environment while simultaneously tracking a location within it, for example for robot mapping and navigation.
The blocks of system 200 may be implemented with circuitry, computer software, or a combination of both. For example, at least a portion of system 200 may be implemented in a digital image processor, or with computer instructions. In one embodiment, the system 200 may be adapted for augmented reality (AR) devices. The hardware elements of an AR device primarily include a processor (e.g., an image processor), a display (e.g., a head-mounted display), and a sensor (e.g., a color-depth camera, such as an RGB-D camera capturing red, green, blue, and depth information). The sensor or camera captures a scene to generate image frames (frames for short), which are fed to the processor to perform the operations of the system 200, thereby generating augmented reality content on the display.
In this embodiment, the system 200 may include a reference frame selector 21 for generating a reference frame S according to the current frame C and the current key frame K. The current frame C may be provided by a camera (e.g., an RGB-D camera).
Fig. 3 shows a flowchart of the reference frame selector 21 of Fig. 2. In step 211, at least one iteration of the iterative closest point (ICP) method is performed on the current frame C and the current key frame K. In this embodiment, the ICP method is performed only once for the current frame C and the current key frame K. Operational details of the ICP method can be found in the aforementioned "Fast Visual Odometry Using Intensity-Assisted Iterative Closest Point" and "Dense Visual SLAM for RGB-D Cameras". Details can also be found in "Multiview Registration for Large Data Sets" by Kari Pulli, Second International Conference on 3-D Digital Imaging and Modeling, October 1999; and "Tracking a Depth Camera: Parameter Exploration for Fast ICP" by François Pomerleau et al., IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), September 2011, the contents of which are considered part of this specification.
Then, in step 212, if the number of inliers in the result of the ICP method is less than a predetermined value n (indicating that the matching quality does not reach a predetermined level), the current spare frame B is selected as the reference frame (i.e., S = B); otherwise, the current key frame K is selected as the reference frame (i.e., S = K). In other words, if the current frame C cannot be aligned with the current key frame K (i.e., the matching quality does not reach the predetermined level), the current spare frame B is used in the subsequent ICP method; otherwise, the current key frame K continues to be used.
In this embodiment, a pair of matched points is considered an inlier if its Euclidean distance falls within a proportion of the matching distances. A larger number of inliers indicates higher matching quality. In this embodiment, the previous frame, temporarily stored in a storage device (e.g., a buffer), is used as the current spare frame B.
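The selection rule of steps 211 and 212 can be sketched as follows. Here `match_distances` stands for the per-pair distances produced by one ICP pass, and the median-based inlier test is an assumed concrete reading of the "proportion of the matching distances" criterion; the patent does not fix an exact formula.

```python
import numpy as np

def select_reference_frame(match_distances, keyframe, spare,
                           min_inliers=100, inlier_ratio=2.0):
    """Pick the reference frame S from the key frame K and spare frame B.

    A matched pair counts as an inlier when its Euclidean distance falls
    within `inlier_ratio` times the median matching distance (an assumed
    reading of the criterion above). If fewer than `min_inliers` pairs
    qualify, matching quality is below the predetermined level and the
    spare frame is used instead of the key frame.
    """
    d = np.asarray(match_distances, dtype=float)
    threshold = inlier_ratio * np.median(d)
    inliers = int(np.sum(d <= threshold))
    return spare if inliers < min_inliers else keyframe   # S = B or S = K
```

The value of `min_inliers` corresponds to the predetermined value n in step 212.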
The system 200 of this embodiment may include an iterative closest point (ICP) loop unit 22, which performs the ICP method on the reference frame S (output by the reference frame selector 21) and the current frame C. The ICP loop unit 22 thereby generates the pose Pc of the current frame and an error value. Details of the operation of the ICP loop unit 22 can be found in the references cited above.
The system 200 of this embodiment may include a key frame update unit 23, which generates a new key frame K' according to a difference condition (offset or difference) between the pose Pc of the current frame and the pose Ps of the reference frame. Fig. 4 shows a flowchart of the key frame update unit 23 (Fig. 2) according to an embodiment of the invention. In step 231, a difference condition between the pose Pc of the current frame and the pose Ps of the reference frame is determined. The difference condition of this embodiment can be expressed as follows:
displacement(t) > η1
rotation(Rθ) > η2
where t represents the displacement (translation), θ represents the rotation angle, R represents the depth distance, and η1 and η2 are predetermined thresholds of the difference.
If the difference condition is satisfied (indicating that the current frame no longer matches the latest key frame K), the current frame C is used as the new key frame K' (i.e., K' = C); otherwise, the reference frame S is used as the new key frame K' (i.e., K' = S).
It is noted that the difference condition of step 231 includes two sub-conditions: (1) the displacement sub-condition and (2) the rotation sub-condition. In one embodiment, the difference condition is satisfied when at least one of the sub-conditions is satisfied. In another embodiment, the difference condition is satisfied only when both sub-conditions are satisfied.
According to one feature of this embodiment, the rotation sub-condition (i.e., rotation(Rθ) > η2) considers both the depth distance R and the rotation angle θ. The reason is that, for the same rotation angle θ, points at different depth distances undergo different apparent movements: the arc length swept by a point at depth R is approximately Rθ, so two points at different depths move by different amounts.
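The two sub-conditions and the key-frame decision of step 231 can be sketched as follows. The 4x4 homogeneous pose representation and the single representative depth value are assumptions for illustration; the patent fixes only the two threshold tests.

```python
import numpy as np

def keyframe_differs(pose_c, pose_s, depth, eta1=0.05, eta2=0.05,
                     require_both=False):
    """Evaluate the difference condition between poses Pc and Ps.

    Poses are 4x4 homogeneous matrices. `depth` is a representative
    scene depth (an assumption), so the rotation sub-condition R*theta
    approximates the arc length swept at that depth.
    """
    rel = np.linalg.inv(pose_s) @ pose_c          # relative transform
    t = np.linalg.norm(rel[:3, 3])                # displacement magnitude
    cos_theta = np.clip((np.trace(rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)                  # rotation angle
    disp = t > eta1                               # displacement sub-condition
    rot = depth * theta > eta2                    # rotation sub-condition
    return (disp and rot) if require_both else (disp or rot)

def update_keyframe(current, reference, pose_c, pose_s, depth):
    """K' = C when the difference condition holds, otherwise K' = S."""
    return current if keyframe_differs(pose_c, pose_s, depth) else reference
```

Setting `require_both=True` gives the alternative embodiment in which both sub-conditions must hold.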
The system 200 of this embodiment may include a spare frame update unit 24 for providing a new spare frame B'. In this embodiment, the spare frame update unit 24 may include a storage device (e.g., a buffer) that temporarily stores the current frame as the new spare frame B' for the next execution of the ICP method.
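A minimal sketch of such a buffer, assuming a simple one-slot store (the patent specifies only a storage device holding the current frame):

```python
class SpareFrameUpdater:
    """One-slot buffer for the spare frame update unit 24 (an assumed
    minimal form): stores the current frame C so it can serve as the
    new spare frame B' in the next run of the ICP method."""

    def __init__(self):
        self._buffer = None            # holds the most recent frame

    def update(self, current_frame):
        # Temporarily store the current frame as the new spare frame B'.
        self._buffer = current_frame

    @property
    def spare(self):
        return self._buffer
```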
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (12)
1. A system for selecting a key frame by an iterative closest point method, comprising:
a reference frame selector for generating a reference frame according to the current frame and the current key frame;
an iterative closest point loop unit, which performs the iterative closest point method on the reference frame and the current frame to generate the pose of the current frame; and
a key frame update unit for generating a new key frame according to a difference condition between the pose of the current frame and the pose of the reference frame;
wherein the reference frame selector performs the following steps:
performing at least one iteration of the iterative closest point method on the current frame and the current key frame; and
if the matching quality of the result of the iterative closest point method does not reach a predetermined level, selecting the current spare frame as the reference frame; otherwise, selecting the current key frame as the reference frame;
wherein, if the number of inliers in the result of the iterative closest point method is less than a predetermined value, the matching quality does not reach the predetermined level.
2. The system of claim 1, wherein the current spare frame is a temporally previous frame.
3. The system of claim 1, wherein the difference condition involves a displacement, a rotation angle and a depth distance between the pose of the current frame and the pose of the reference frame.
4. The system of claim 3, wherein the difference condition comprises:
displacement sub-condition: displacement(t) > η1;
rotation sub-condition: rotation(Rθ) > η2;
wherein t represents the displacement, θ represents the rotation angle, R represents the depth distance, and η1 and η2 are predetermined thresholds of the difference.
5. The system of claim 4, wherein the current frame is used as the new key frame if at least one sub-condition is satisfied, and the reference frame is used as the new key frame otherwise.
6. The system of claim 1, further comprising a spare frame update unit for providing a new spare frame according to the current frame.
7. The system of claim 6, wherein the spare frame update unit comprises a storage device for temporarily storing the current frame as the new spare frame for the next execution of the iterative closest point method.
8. A method of selecting a key frame by an iterative closest point method, comprising:
generating a reference frame according to the current frame and the current key frame;
performing the iterative closest point method on the reference frame and the current frame to generate the pose of the current frame; and
generating a new key frame according to a difference condition between the pose of the current frame and the pose of the reference frame;
wherein the step of generating the reference frame comprises:
performing at least one iteration of the iterative closest point method on the current frame and the current key frame; and
if the matching quality of the result of the iterative closest point method does not reach a predetermined level, selecting the current spare frame as the reference frame; otherwise, selecting the current key frame as the reference frame;
wherein, if the number of inliers in the result of the iterative closest point method is less than a predetermined value, the matching quality does not reach the predetermined level.
9. The method of claim 8, wherein the current spare frame is a temporally previous frame.
10. The method of claim 8, wherein the difference condition involves a displacement, a rotation angle and a depth distance between the pose of the current frame and the pose of the reference frame.
11. The method of claim 10, wherein the difference condition comprises:
displacement sub-condition: displacement(t) > η1;
rotation sub-condition: rotation(Rθ) > η2;
wherein t represents the displacement, θ represents the rotation angle, R represents the depth distance, and η1 and η2 are predetermined thresholds of the difference.
12. The method of claim 11, wherein the current frame is used as the new key frame if at least one sub-condition is satisfied, and the reference frame is used as the new key frame otherwise.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710897413.8A CN109579794B (en) | 2017-09-28 | 2017-09-28 | System and method for selecting key frame by iterative closest point method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109579794A CN109579794A (en) | 2019-04-05 |
CN109579794B true CN109579794B (en) | 2021-03-23 |
Family
ID=65912893
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710897413.8A Active CN109579794B (en) | 2017-09-28 | 2017-09-28 | System and method for selecting key frame by iterative closest point method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109579794B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102622762A (en) * | 2011-01-31 | 2012-08-01 | Microsoft Corp. | Real-time camera tracking using depth maps |
CN104050712A (en) * | 2013-03-15 | 2014-09-17 | Sony Corp. | Method and apparatus for establishing three-dimensional model |
WO2015005577A1 (en) * | 2013-07-09 | 2015-01-15 | Samsung Electronics Co., Ltd. | Camera pose estimation apparatus and method |
CN104395932A (en) * | 2012-06-29 | 2015-03-04 | Mitsubishi Electric Corp. | Method for registering data |
CN105453559A (en) * | 2013-04-16 | 2016-03-30 | DotProduct LLC | Handheld portable optical scanner and method of using |
CN105989604A (en) * | 2016-02-18 | 2016-10-05 | Hefei University of Technology | Target object three-dimensional color point cloud generation method based on KINECT |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9286717B2 (en) * | 2013-07-30 | 2016-03-15 | Hewlett-Packard Development Company, L.P. | 3D modeling motion parameters |
- 2017-09-28 CN CN201710897413.8A patent/CN109579794B/en active Active
Non-Patent Citations (1)
Title |
---|
ICP-based pose-graph SLAM; Ellon Mendes et al.; 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR); 2016-10-31; pp. 195-200 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10296798B2 (en) | System and method of selecting a keyframe for iterative closest point | |
CN110945565B (en) | Dense visual SLAM with probability bin map | |
CN108700946B (en) | System and method for parallel ranging and mapping fault detection and recovery | |
KR101725060B1 (en) | Apparatus for recognizing location mobile robot using key point based on gradient and method thereof | |
Audras et al. | Real-time dense appearance-based SLAM for RGB-D sensors | |
US8644557B2 (en) | Method and apparatus for estimating position of moving vehicle such as mobile robot | |
US7599548B2 (en) | Image processing apparatus and image processing method | |
CN111462207A (en) | RGB-D simultaneous positioning and map creation method integrating direct method and feature method | |
Chien et al. | Visual odometry driven online calibration for monocular lidar-camera systems | |
WO2005043466A1 (en) | Estimation system, estimation method, and estimation program for estimating object state | |
KR20150144727A (en) | Apparatus for recognizing location mobile robot using edge based refinement and method thereof | |
Lowe et al. | Complementary perception for handheld slam | |
Zuñiga-Noël et al. | Automatic multi-sensor extrinsic calibration for mobile robots | |
Nobre et al. | Drift-correcting self-calibration for visual-inertial SLAM | |
Rehder et al. | Online stereo camera calibration from scratch | |
JP6922348B2 (en) | Information processing equipment, methods, and programs | |
Sünderhauf et al. | Towards using sparse bundle adjustment for robust stereo odometry in outdoor terrain | |
Melbouci et al. | Model based rgbd slam | |
CN109579794B (en) | System and method for selecting key frame by iterative closest point method | |
KR101766823B1 (en) | Robust visual odometry system and method to irregular illumination changes | |
TWI652447B (en) | System and method of selecting a keyframe for iterative closest point | |
KR102406240B1 (en) | Robust stereo visual inertial navigation apparatus and method | |
Dias et al. | Accurate stereo visual odometry based on keypoint selection | |
KR20220158628A (en) | Method and apparayus for depth-aided visual inertial odometry | |
KR20230049969A (en) | Method and apparatus for global localization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||