CN111666796B - Perceptibly moving key point selection system suitable for iteration closest point method - Google Patents


Info

Publication number
CN111666796B
CN111666796B CN201910177166.3A CN201910177166A
Authority
CN
China
Prior art keywords: point, region, depth value, unit, key
Prior art date
Legal status
Active
Application number
CN201910177166.3A
Other languages
Chinese (zh)
Other versions
CN111666796A (en)
Inventor
陈俊维
萧文远
谢明得
Current Assignee
Himax Technologies Ltd
NCKU Research and Development Foundation
Original Assignee
Himax Technologies Ltd
NCKU Research and Development Foundation
Priority date
Filing date
Publication date
Application filed by Himax Technologies Ltd, NCKU Research and Development Foundation filed Critical Himax Technologies Ltd
Priority to CN201910177166.3A priority Critical patent/CN111666796B/en
Publication of CN111666796A publication Critical patent/CN111666796A/en
Application granted granted Critical
Publication of CN111666796B publication Critical patent/CN111666796B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/25: Determination of region of interest [ROI] or a volume of interest [VOI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A motion-aware keypoint selection system adaptable to the iterative closest point method includes a cropping unit that receives an image and selects at least one region of interest comprising a subset of points selected from the image; a point quality estimation unit that generates a point quality for each point in the region of interest according to the frame speed; and a suppression unit that receives the point qualities and screens the region of interest accordingly to generate keypoints.

Description

Perceptibly moving key point selection system suitable for iteration closest point method
Technical Field
The present invention relates to the iterative closest point (ICP) method, and more particularly to a motion-aware keypoint selection system adaptable to the iterative closest point method.
Background
The iterative closest point (ICP) method can be used to minimize the difference between two point clouds: the target (or reference) cloud is kept fixed, while the source cloud is transformed to best match it.
The iterative closest point method is used in visual odometry to determine the position and orientation of a robot in a wide variety of robotic applications. It is typically used to reconstruct two- or three-dimensional surfaces, or to localize robots for optimal path planning. The iterative closest point method iteratively revises the transformation (e.g., translation and rotation) so as to reduce an error metric, such as the distance between matched coordinate pairs of the source and target clouds.
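The iteration just described (match closest points, estimate a rigid transform, apply, repeat) can be sketched in a few lines. This is a generic textbook ICP sketch for illustration, not the patented system; the helper names are invented, and brute-force matching is used for clarity:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=20):
    """Iteratively align the source cloud to the fixed target cloud."""
    src = source.copy()
    for _ in range(iters):
        d = np.linalg.norm(src[:, None] - target[None], axis=2)   # all-pairs distances
        matched = target[d.argmin(axis=1)]                        # closest-point correspondences
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t                                       # apply the estimated transform
    return src
```

Keypoint selection, the subject of this patent, reduces the number of points entering the all-pairs distance computation, which dominates the cost of each iteration.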
The first step in applying the iterative closest point method is usually keypoint detection. For example, simultaneous localization and mapping (SLAM), which constructs or updates a map of an unknown environment while tracking the robot's position, and visual tracking both depend on keypoint detection for their robustness and accuracy.
Conventional keypoint detectors have relatively high computational complexity because they feed all points into the iterative closest point algorithm. Moreover, non-ideal feature pairs degrade the performance of the iterative closest point method. A novel keypoint selection technique is therefore desirable to overcome the disadvantages of conventional keypoint detectors.
Disclosure of Invention
In view of the foregoing, it is a primary object of the present invention to overcome the drawbacks of the prior art by providing a motion-aware keypoint selection system that can be applied to the iterative closest point (ICP) method to reduce computational complexity and enhance accuracy.
The purpose of the invention and the technical problem to be solved are realized by adopting the following technical scheme.
The motion-aware keypoint selection system adaptable to the iterative closest point method includes a cropping unit, a point quality estimation unit, and a suppression unit. The cropping unit receives an image and selects at least one region of interest, which comprises a subset of points selected from the image. The point quality estimation unit generates a point quality for each point in the region of interest according to the frame speed. The suppression unit receives the point qualities and screens the region of interest accordingly to generate keypoints.
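The data flow of the three units can be expressed as a simple function chain. This is a hypothetical sketch of the architecture only; the unit implementations are passed in as placeholders:

```python
# Hypothetical sketch: cropping unit -> point quality estimation unit -> suppression unit.
def select_keypoints(image, frame_speed, crop, estimate_quality, suppress):
    roi = crop(image)                                            # point-based: select region of interest
    qualities = [estimate_quality(p, frame_speed) for p in roi]  # point-based: quality per ROI point
    return suppress(roi, qualities)                              # frame-based: screen ROI into keypoints
```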
The object of the present invention and the technical problems solved thereby can be further achieved by the following technical measures.
In the foregoing motion-aware keypoint selection system adaptable to the iterative closest point method, the cropping unit selects a near-edge region as the region of interest.
In the foregoing system, the at least one region of interest includes two near-edge regions: one located to the left of an occluded skipping region (OSR) and the other to the right of a noise skipping region (NSR), the occluded skipping region being adjacent to the left of the last valid pixel and the noise skipping region being adjacent to the right of the current valid pixel.
In the foregoing system, each point of the image includes a color and a depth.
In the foregoing system, the cropping unit and the point quality estimation unit perform point-based operations.
In the foregoing system, the suppression unit performs a frame-based operation.
In the foregoing system, the point quality estimation unit includes a model selection unit for generating a key depth value according to the frame speed.
In the foregoing system, the model selection unit includes a lookup table from which the key depth value corresponding to the frame speed is obtained.
In the foregoing system, the larger the frame speed, the larger the corresponding key depth value.
In the foregoing system, the point quality estimation unit includes an estimation model unit that receives the key depth value to establish a relationship curve between point quality and depth value, from which the point quality of a given point in the region of interest is obtained.
In the foregoing system, the relationship curve between point quality and depth value is stored in a lookup table.
In the foregoing system, the estimation model unit performs the following steps: receiving the key depth value; taking the key depth value as the vertex of the relationship curve, at which the corresponding point quality is the maximum point quality; and using a predetermined function, with the maximum point quality at the vertex, to establish the relationship curve between point quality and depth value, so that each depth value corresponds to a point quality.
In the foregoing system, the predetermined function is a Gaussian function.
Compared with the prior art, the present invention offers clear advantages and beneficial effects. By means of the above technical solution, the motion-aware keypoint selection system adaptable to the iterative closest point method achieves considerable technical progress and practicality, has broad industrial value, and at least reduces computational complexity while enhancing accuracy.
The foregoing is merely an overview of the technical solution of the present invention. To make the technical means, objects, features, and advantages of the present invention more clearly understood, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
FIG. 1 is a block diagram of a motion-aware keypoint selection system adaptable to the iterative closest point method according to the present invention.
FIG. 2A shows one row of a depth image.
FIG. 2B shows the occluded skipping region, the noise skipping region, and the near-edge regions determined in the depth image by the cropping unit of FIG. 1.
FIG. 3 is a block diagram of the point quality estimation unit of FIG. 1.
FIGS. 4A-4C show quality-depth curves for different frame speeds.
[Notation]
100. Motion-aware keypoint selection system
11. Cropping unit
12. Point quality estimation unit
121. Model selection unit
122. Estimation model unit
13. Suppression unit
q_n Last valid pixel
q_c Current valid pixel
NER Near-edge region
OSR Occluded skipping region
NSR Noise skipping region
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve its objects and their effects, specific embodiments, structures, methods, steps, features, and effects of the motion-aware keypoint selection system adaptable to the iterative closest point method are described in detail below with reference to the accompanying drawings and preferred embodiments.
The foregoing and other technical aspects, features, and advantages of the present invention will be apparent from the following detailed description of preferred embodiments, read in conjunction with the accompanying drawings. While the present invention is described in connection with preferred embodiments, it is not limited to the disclosed embodiments, but is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
FIG. 1 is a block diagram of a motion-aware keypoint selection system (hereinafter, keypoint selection system) 100 adaptable to the iterative closest point (ICP) method. The blocks of the keypoint selection system 100 may be implemented in software, hardware, or a combination thereof, and may be executed by a digital image processor.
In an embodiment, the keypoint selection system 100 may be applied to augmented reality (AR) devices. The hardware of an augmented reality device primarily includes a processor (e.g., an image processor), a display (e.g., a head-mounted display), and sensors (e.g., a color-depth camera such as an RGB-D camera providing red, green, blue, and depth channels). The sensor or camera captures a scene to generate an image frame, which is then transmitted to the processor for operation of the keypoint selection system 100, thereby generating augmented reality on the display.
In this embodiment, the keypoint selection system 100 may include a cropping unit 11 that receives an image and, after a screening process, selects at least one region of interest (ROI) comprising a subset of the points (or pixels) of the image. Points outside the region of interest are discarded to simplify the processing of the keypoint selection system 100 and substantially reduce computational complexity without significantly reducing accuracy. Each point of the image may include a color (e.g., red, green, blue) and a depth. The cropping unit 11 of this embodiment performs a point-based operation.
According to one feature of this embodiment, the cropping unit 11 selects near-edge regions (NERs) as the region of interest. FIG. 2A illustrates one row of a depth image, where q_n denotes the last valid (or background) pixel (also known as the occluded edge) and q_c denotes the current valid (or foreground) pixel (also known as the occluding edge). FIG. 2B shows the near-edge regions (NERs) (i.e., the regions of interest), the occluded skipping region (OSR), and the noise skipping region (NSR) determined in the depth image by the cropping unit 11 of FIG. 1 according to this embodiment. The occluded skipping region (OSR) is adjacent to the left of the last valid pixel q_n, and the noise skipping region (NSR) is adjacent to the right of the current valid pixel q_c. The noise skipping region (NSR) usually includes several (e.g., 12) pixels whose normals are difficult to estimate because they correspond to a boundary or corner, so this embodiment discards the noise skipping region (NSR). The occluded skipping region (OSR) usually includes several (e.g., 2) pixels corresponding to an occluded area that has no correct correspondence in the target frame, so this embodiment discards the occluded skipping region (OSR) as well. The near-edge regions (NERs), located to the left of the occluded skipping region (OSR) and to the right of the noise skipping region (NSR), contain useful information and are therefore selected as the region of interest. In an embodiment, the pixel widths of the occluded skipping region (OSR), the noise skipping region (NSR), and the near-edge region (NER) may be preset values.
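The row partitioning around a depth discontinuity can be sketched as follows. The jump threshold, the OSR/NSR/NER pixel widths, and whether the OSR includes q_n itself are assumed values standing in for the preset widths mentioned above:

```python
import numpy as np

OSR_W, NSR_W, NER_W = 2, 12, 8   # assumed preset pixel widths

def label_row(depth, jump_thresh=0.3):
    """Label one depth-image row: 'NER' kept as ROI, 'OSR'/'NSR' discarded, '' elsewhere."""
    labels = np.full(depth.shape, '', dtype=object)
    jumps = np.flatnonzero(np.abs(np.diff(depth)) > jump_thresh)
    for j in jumps:   # j is the last valid pixel q_n; j + 1 is the current valid pixel q_c
        labels[max(0, j - OSR_W + 1): j + 1] = 'OSR'                          # ends at q_n
        labels[j + 1: j + 1 + NSR_W] = 'NSR'                                  # starts at q_c
        labels[max(0, j - OSR_W - NER_W + 1): max(0, j - OSR_W + 1)] = 'NER'  # left of the OSR
        labels[j + 1 + NSR_W: j + 1 + NSR_W + NER_W] = 'NER'                  # right of the NSR
    return labels
```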
The keypoint selection system 100 of this embodiment may include a point quality estimation unit 12 that generates a point quality for each point within the region of interest based on the frame speed, which is what makes the keypoint selection system 100 motion-aware. The point quality estimation unit 12 of this embodiment performs a point-based operation.
In an embodiment, the noise model used by the saliency function of the point quality estimation unit 12 is disclosed in "Modeling Kinect Sensor Noise for Improved 3D Reconstruction and Tracking" by C.V. Nguyen et al., published at the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (3DIMPVT), the disclosure of which is considered part of this specification.
FIG. 3 shows a detailed block diagram of the point quality estimation unit 12 of FIG. 1. The point quality estimation unit 12 may include a model selection unit 121 for generating a key depth value according to the frame speed. In an embodiment, the model selection unit 121 includes an experimentally obtained lookup table from which the key depth value corresponding to the frame speed is obtained. The frame speed may be obtained by a speedometer or an inertial measurement unit (IMU).
FIGS. 4A-4C show quality-depth curves for different frame speeds, where the depth value at each curve's vertex is the key depth value corresponding to that frame speed. As illustrated in FIGS. 4A-4C, the larger the frame speed, the larger the corresponding key depth value.
Returning to FIG. 3, the point quality estimation unit 12 may include an estimation model unit 122 that receives the key depth value (from the model selection unit 121) to establish a relationship curve between point quality and depth value, from which the point quality of a point in the region of interest is obtained. In an embodiment, the relationship curve between point quality and depth value may be stored in a lookup table. In detail, as illustrated in FIG. 4A (for a frame speed of 0.000922), the estimation model unit 122 receives the key depth value (e.g., about 60 cm) from the model selection unit 121. The estimation model unit 122 then takes the key depth value as the vertex of the relationship curve, where the corresponding point quality is 1 (i.e., the maximum point quality). Next, the estimation model unit 122 uses a predetermined function (e.g., a Gaussian function), with the maximum point quality at the vertex, to establish the relationship curve between point quality and depth value, which has a predetermined distribution (e.g., Gaussian or normal). Each depth value thus corresponds to a point quality. For other frame speeds (as illustrated in FIGS. 4B and 4C), the relationship curve or lookup table between point quality and depth value can be obtained on the same principle, yielding the point quality of a given point in the region of interest.
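A minimal numeric sketch of units 121 and 122: a lookup table maps frame speed to a key depth value, and a Gaussian centered at that key depth assigns each depth a quality in (0, 1]. Only the frame speed 0.000922 with a key depth near 60 cm appears in the text; the other table entries and the Gaussian width sigma are invented for illustration:

```python
import math

# Hypothetical lookup table: frame speed -> key depth value (cm).
SPEED_TO_KEY_DEPTH = {0.000922: 60.0, 0.005: 90.0, 0.02: 130.0}

def key_depth_for_speed(speed):
    """Model selection unit: pick the table entry with the nearest frame speed."""
    return SPEED_TO_KEY_DEPTH[min(SPEED_TO_KEY_DEPTH, key=lambda s: abs(s - speed))]

def point_quality(depth, speed, sigma=25.0):
    """Estimation model unit: Gaussian curve with its vertex (quality 1) at the key depth."""
    mu = key_depth_for_speed(speed)
    return math.exp(-((depth - mu) ** 2) / (2.0 * sigma ** 2))
```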
Returning to FIG. 1, the keypoint selection system 100 may include a suppression unit 13 that receives the point qualities (from the point quality estimation unit 12) and accordingly performs a further screening of the points of the region of interest (from the cropping unit 11) to generate keypoints that are uniformly distributed rather than clustered. Because fewer keypoints are needed to cover the image, computation is accelerated. The suppression unit 13 of this embodiment performs a frame-based operation.
In this embodiment, the suppression unit 13 uses a non-maximum suppression (NMS) algorithm, details of which are disclosed in "Multi-image matching using multi-scale oriented patches" by M. Brown et al., published at the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), and in "Efficient adaptive non-maximal suppression algorithms for homogeneous spatial keypoint distribution," published in Pattern Recognition Letters, the disclosures of which are considered part of this specification.
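The suppression step can be illustrated with a simplified grid-based variant (an assumption; the embodiment cites adaptive NMS instead): keeping only the highest-quality point per cell spreads keypoints uniformly over the frame. The cell size is an assumed parameter:

```python
# Simplified grid-based suppression sketch, not the cited adaptive NMS algorithm.
def nms_select(points, cell=20):
    """points: iterable of (x, y, quality); returns the best point of each grid cell."""
    best = {}
    for x, y, q in points:
        key = (x // cell, y // cell)
        if key not in best or q > best[key][2]:
            best[key] = (x, y, q)
    return sorted(best.values())
```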
Other operable embodiments may be derived by anyone with basic knowledge of the technical field through modifications within the scope of the present invention. The protection scope of the claimed technical solution should include all variations sharing its technical features.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. A motion-aware keypoint selection system adaptable to an iterative closest point method, comprising:
a cropping unit receiving an image to select at least one region of interest, the region of interest comprising a subset of points selected from the image;
a point quality estimation unit generating a point quality for each point in the region of interest according to a frame speed; and
a suppression unit receiving the point qualities and screening the region of interest accordingly to generate keypoints;
wherein the point quality estimation unit includes:
a model selection unit for generating a key depth value according to the frame speed; and
an estimation model unit receiving the key depth value to establish a relationship curve between point quality and depth value, thereby obtaining the point quality of a given point in the region of interest;
wherein the estimation model unit performs the steps of:
receiving the key depth value;
taking the key depth value as the vertex of the relationship curve, at which the corresponding point quality is the maximum point quality; and
using a predetermined function, with the maximum point quality at the vertex, to establish the relationship curve between point quality and depth value, so that each depth value corresponds to a point quality.
2. The system of claim 1, wherein the cropping unit selects a near-edge region as the region of interest.
3. The system of claim 2, wherein the at least one region of interest comprises two of the near-edge regions, one located to the left of an occluded skipping region and the other to the right of a noise skipping region, the occluded skipping region being adjacent to the left of a last valid pixel and the noise skipping region being adjacent to the right of a current valid pixel.
4. The system of claim 1, wherein each point of the image comprises a color and a depth.
5. The system of claim 1, wherein the model selection unit comprises a lookup table for obtaining the key depth value corresponding to the frame speed.
6. The system of claim 1, wherein the larger the frame speed, the larger the corresponding key depth value.
7. The system of claim 1, wherein the relationship curve between point quality and depth value is stored in a lookup table.
CN201910177166.3A 2019-03-08 2019-03-08 Perceptibly moving key point selection system suitable for iteration closest point method Active CN111666796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910177166.3A CN111666796B (en) 2019-03-08 2019-03-08 Perceptibly moving key point selection system suitable for iteration closest point method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910177166.3A CN111666796B (en) 2019-03-08 2019-03-08 Perceptibly moving key point selection system suitable for iteration closest point method

Publications (2)

Publication Number Publication Date
CN111666796A CN111666796A (en) 2020-09-15
CN111666796B (en) 2023-04-07

Family

ID=72381381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910177166.3A Active CN111666796B (en) 2019-03-08 2019-03-08 Perceptibly moving key point selection system suitable for iteration closest point method

Country Status (1)

Country Link
CN (1) CN111666796B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201243974A (en) * 2011-03-25 2012-11-01 Kla Tencor Corp Methods and apparatus for optimization of inspection speed by generation of stage speed profile and selection of care areas for automated wafer inspection
CN105528061A (en) * 2014-09-30 2016-04-27 财团法人成大研究发展基金会 Gesture recognition system
CN105809676A (en) * 2016-03-03 2016-07-27 魏晓峰 Image processing method and image processing apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140029808A1 (en) * 2012-07-23 2014-01-30 Clicrweight, LLC Body Condition Score Determination for an Animal
WO2017044782A1 (en) * 2015-09-11 2017-03-16 EyeVerify Inc. Image and feature quality, image enhancement and feature extraction for ocular-vascular and facial recognition, and fusing ocular-vascular with facial and/or sub-facial information for biometric systems
WO2018129715A1 (en) * 2017-01-13 2018-07-19 浙江大学 Simultaneous positioning and dense three-dimensional reconstruction method


Also Published As

Publication number Publication date
CN111666796A (en) 2020-09-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant