CN112348032A - SIFT algorithm key point detection method based on hardware circuit - Google Patents


Info

Publication number
CN112348032A
CN112348032A (application CN201910736007.2A; granted as CN112348032B)
Authority
CN
China
Prior art keywords
point
pixel
detected
points
extreme
Prior art date
Legal status
Granted
Application number
CN201910736007.2A
Other languages
Chinese (zh)
Other versions
CN112348032B (en)
Inventor
赵旺 (Zhao Wang)
肖刚军 (Xiao Gangjun)
Current Assignee
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority: CN201910736007.2A
Publication of CN112348032A
Application granted
Publication of CN112348032B


Classifications

    • G06V10/462 — Salient features, e.g. scale-invariant feature transforms [SIFT]
      (G PHYSICS → G06 COMPUTING; CALCULATING OR COUNTING → G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING → G06V10/00 Arrangements for image or video recognition or understanding → G06V10/40 Extraction of image or video features → G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features)
    • G06V10/757 — Matching configurations of points or features
      (G06V10/70 Arrangements using pattern recognition or machine learning → G06V10/74 Image or video pattern matching; Proximity measures in feature spaces → G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons; coarse-fine approaches)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a hardware-circuit-based method for detecting SIFT key points, comprising the following steps: by having a register collect multiple scale layers and multiple image pixels at one time, the image layers and pixel points required to construct the difference-of-Gaussian scale space are unified into a single key-point detection space, and accurate key points are determined by extreme-point detection followed by two passes of precise extreme-point localization, greatly reducing the complexity of gathering the neighborhood information of each point to be detected. Compared with the prior art, this scheme effectively shortens data acquisition and analysis during extreme-point detection and reduces the complexity of the hardware flow, thereby improving algorithm efficiency, saving hardware circuit area, and improving the real-time performance of SIFT key-point detection.

Description

SIFT algorithm key point detection method based on hardware circuit
Technical Field
The invention relates to the technical field of image processing, in particular to a hardware circuit-based SIFT algorithm key point detection method.
Background
SIFT (Scale-Invariant Feature Transform) is a feature description method used in the field of image processing. The description is scale-invariant, can detect key points in an image, and serves as a local feature descriptor. SIFT features are based on locally salient interest points on an object and are independent of image size and rotation; they are also quite tolerant of illumination changes, noise, and small viewpoint changes. The method is widely used in video tracking, 3D image modeling, object recognition, panoramic image stitching, and similar fields.
Because the SIFT algorithm is computationally heavy, and because in practice camera performance keeps improving — image resolution rises, each image carries more information, and the amount of data to process grows sharply — software alone struggles to perform the image processing and to meet real-time requirements. Many works therefore propose exploiting the high-speed parallel computing capability of hardware circuits, adopting a high-speed parallel architecture design for the SIFT algorithm to meet real-time requirements.
When executing the SIFT algorithm, the pixel information of the difference-of-Gaussian scale space is stored in on-chip SRAM. To obtain a more accurate extreme point, the position of the interpolation center must be changed repeatedly, and managing the SRAM read addresses after each change of interpolation center becomes complicated. Under the existing logic for controlling SRAM read/write addresses, the repeated data acquisition and analysis increase hardware design complexity, reduce algorithm efficiency, and noticeably hurt the real-time performance of the algorithm.
Therefore, detecting key points simply and efficiently — reducing hardware design complexity, shrinking hardware circuit area, and improving real-time performance — has become the key technical difficulty in implementing SIFT key-point detection in a hardware circuit.
Disclosure of Invention
To address these technical problems, the invention provides a method that supports unified multi-layer, multi-point detection of points to be detected in a hardware-based difference-of-Gaussian scale space. The specific technical scheme is as follows:
a hardware circuit-based SIFT algorithm key point detection method comprises the following steps: step 1, synchronously collecting all pixel points of each image layer in a group of difference Gaussian scale spaces, wherein the pixel points of the adjacent 3 image layers in the middle of the group of difference Gaussian scale spaces comprise the same number of pixel points to be detected and neighborhood comparison points thereof; step 2, obtaining and updating pixel information of the pixel point to be detected and the neighborhood comparison point thereof cached in the register; step 3, establishing a target pixel detection area taking the pixel point to be detected as the center according to the pixel information of the pixel point to be detected and the neighborhood comparison points thereof; step 4, judging whether the pixel point to be detected is an extreme point in the target pixel detection area or not by comparing the magnitude relation of the pixel information of the pixel point to be detected and the neighborhood comparison points thereof; step 5, entering step 7 when the pixel point to be detected is not the extreme point in step 4; step 6, when the pixel point to be detected is detected to be the extreme point in the step 4, limiting an iteration mode of subsequent extreme point accurate positioning according to the extreme point offset obtained by accurately positioning the extreme point, enabling the pixel coordinate information of the pixel point to be detected to finish offset correction according to the extreme point offset in the iteration, and removing invalid extreme points to obtain key points; the pixel coordinate information comprises pixel coordinates of any coordinate dimension in a difference Gaussian scale space; and 7, judging whether the to-be-detected pixel points of all image layers in the current group of difference Gaussian scale spaces complete the key point detection, if so, returning to the step 1 to collect the pixel points of 
all image layers in the next group of difference Gaussian scale spaces, otherwise, traversing the to-be-detected pixel points of the to-be-detected pixel points participating in the steps 1 to 6 in the neighborhood from the register, and returning to the step 2.
Compared with the prior art, this scheme unifies the image layers and pixel points required to construct the difference-of-Gaussian scale space into one key-point detection space by having the registers collect multiple scale layers and multiple pixels at one time, which reduces the complexity of hardware addressing; accurate key points are determined by extreme-point detection plus two passes of precise extreme-point localization, which greatly reduces the complexity of gathering the neighborhood information of each point to be detected. The scheme therefore greatly simplifies data acquisition and address management, and reduces the complexity of the hardware flow, thereby improving algorithm efficiency, saving hardware circuit area, and improving the real-time performance of SIFT key-point detection.
Further, the number of image layers in a group of difference-of-Gaussian scale spaces is 5, and the points to be detected are distributed over the 3 adjacent layers in the middle of the group, so the target pixel detection area occupies 3 adjacent layers of the group. This creates a 3-dimensional neighborhood comparison space for detecting local spatial extreme points within the same group of difference-of-Gaussian scale spaces.
Further, step 4 specifically comprises: compare the pixel value of the point to be detected with the pixel values of its 8 adjacent comparison points in the same-scale image layer, and simultaneously with the pixel values of the 9×2 corresponding comparison points in the adjacent-scale layers above and below; when the pixel value of the point to be detected is the maximum or the minimum, it is determined to be an extreme point of the target pixel detection area. Here a pixel value is the value of a function of three variables — the coordinates (x, y) of the point and the scale σ of its image layer — so the pixel coordinate of each coordinate dimension lies on the x, y, or σ axis; the neighborhood comparison points comprise the adjacent comparison points in the same-scale layer and the corresponding comparison points in the adjacent-scale layers above and below. Comparing the point to be detected with its neighborhood comparison points one by one, and confirming extreme points in real time, improves detection accuracy and ensures no extreme point is lost.
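The 8 + 9×2 = 26-way comparison of step 4 can be sketched as a small software model in Python (a minimal sketch of the comparison rule, not the patent's circuit; the function name and cube indexing are illustrative):

```python
def is_extremum(cube):
    """Check whether the centre sample of a 3x3x3 neighbourhood is a strict
    local extremum, i.e. the maximum or minimum of all 27 samples.

    `cube` is indexed cube[layer][row][col], with the point to be detected
    at cube[1][1][1]; the other 26 entries are its neighbourhood comparison
    points (8 in the same layer, 9 in each adjacent-scale layer).
    """
    centre = cube[1][1][1]
    neighbours = [cube[s][y][x]
                  for s in range(3) for y in range(3) for x in range(3)
                  if (s, y, x) != (1, 1, 1)]
    return all(centre > v for v in neighbours) or \
           all(centre < v for v in neighbours)
```

In hardware the 26 comparisons can run in parallel; the sequential loop here only models the decision, not the timing.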
Further, the specific method of step 6 comprises: within the target pixel detection area, take the extreme point detected in step 4 as the interpolation center, then obtain the extreme-point offset through one pass of precise extreme-point localization; the extreme-point offset comprises the coordinate offsets of the extreme point on the x, y, and σ axes.
When every coordinate offset in the extreme-point offset is smaller than the preset parameter, remove extreme points with low contrast or edge response, and determine the point to be detected as a key point of the image layer.
When any coordinate offset in the extreme-point offset is greater than or equal to the first empirical value, abandon further detection of the extreme point detected in step 4 and return to step 7; the first empirical value is greater than the preset parameter.
When every coordinate offset is smaller than the first empirical value but at least one is greater than or equal to the preset parameter, correct the pixel coordinate information of the point to be detected by the extreme-point offset so that it shifts to a new interpolation center; then obtain the pixel information of the neighborhood comparison points of the corrected point, establish a new target pixel detection area centered on the point shifted to the new interpolation center, perform one more pass of precise extreme-point localization in that area to obtain a new extreme-point offset, and judge whether all of its coordinate offsets are smaller than the preset parameter. If so, remove extreme points with low contrast or edge response and determine the corrected point, with invalid extreme points removed, as a key point of the image layer; if not, return to step 7. Extreme points with low contrast or edge response are the invalid extreme points.
In this scheme the extreme-point offset is converged through two passes of precise localization, so the pixel coordinate information handled by the hardware remains an integer; the scheme also balances algorithm performance against hardware circuit area, and bounds the time before detection moves on to the next pixel or the next group of difference-of-Gaussian scale spaces.
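The accept/shift/discard decision above can be summarised in a few lines of Python (a hypothetical driver mirroring the two-pass rule in the text, with the 0.5 and 1.5 values taken from the embodiment below; the contrast and edge-response tests are separate and omitted here):

```python
PRESET = 0.5            # preset parameter: all offsets below this => converged
FIRST_EMPIRICAL = 1.5   # first empirical value: any offset at or above => discard
MAX_PASSES = 2          # at most two precise-localization passes are allowed

def localize(offsets):
    """Decide the fate of a candidate extreme point from the (dx, dy, dsigma)
    offset vectors produced by successive precise-localization passes.
    Returns 'keypoint' if the offset converges below PRESET within two
    passes, 'discard' otherwise."""
    for off in offsets[:MAX_PASSES]:
        m = max(abs(c) for c in off)
        if m >= FIRST_EMPIRICAL:
            return 'discard'      # drifted too far from the interpolation center
        if m < PRESET:
            return 'keypoint'     # converged; run contrast/edge tests next
        # otherwise: shift the interpolation center and localize once more
    return 'discard'              # still not converged after the second pass
```

For example, an offset of (0.7, 0, 0) triggers a center shift, and the point is kept only if the second pass lands below 0.5 in every dimension.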
Further, the method for precisely locating the extreme point comprises: when the point to be detected is determined to be an extreme point, perform a Taylor expansion of the extreme point at the interpolation center, then solve the Taylor expansion to obtain the offset of the point to be detected relative to the interpolation center. The position of the extreme point is checked and corrected from the numerical value of this offset: when the offset satisfies a certain condition relative to the interpolation center, curve fitting by one more Taylor expansion finds a better-fitting extreme point so as to reduce the offset from the interpolation center, and finally determines whether the new extreme point is an accurate key point of the image layer.
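Solving the second-order Taylor expansion amounts to solving H·δ = −∇D for the offset δ, where ∇D and the Hessian H are typically estimated by finite differences over the 3×3×3 cube. A NumPy sketch under that standard discretization (the indexing convention is illustrative; the patent does not specify the hardware's arithmetic):

```python
import numpy as np

def taylor_offset(D):
    """Offset of the fitted extremum from the cube centre, from a
    second-order Taylor expansion of D(x, y, sigma).  D is a 3x3x3 array
    indexed D[sigma, y, x], with the interpolation centre at D[1, 1, 1]."""
    g = np.array([
        (D[1, 1, 2] - D[1, 1, 0]) / 2.0,   # dD/dx
        (D[1, 2, 1] - D[1, 0, 1]) / 2.0,   # dD/dy
        (D[2, 1, 1] - D[0, 1, 1]) / 2.0,   # dD/dsigma
    ])
    dxx = D[1, 1, 2] - 2 * D[1, 1, 1] + D[1, 1, 0]
    dyy = D[1, 2, 1] - 2 * D[1, 1, 1] + D[1, 0, 1]
    dss = D[2, 1, 1] - 2 * D[1, 1, 1] + D[0, 1, 1]
    dxy = (D[1, 2, 2] - D[1, 2, 0] - D[1, 0, 2] + D[1, 0, 0]) / 4.0
    dxs = (D[2, 1, 2] - D[2, 1, 0] - D[0, 1, 2] + D[0, 1, 0]) / 4.0
    dys = (D[2, 2, 1] - D[2, 0, 1] - D[0, 2, 1] + D[0, 0, 1]) / 4.0
    H = np.array([[dxx, dxy, dxs],
                  [dxy, dyy, dys],
                  [dxs, dys, dss]])
    return np.linalg.solve(H, -g)          # (dx, dy, dsigma)
```

On a purely quadratic D the finite differences are exact, so the recovered offset equals the true sub-pixel displacement of the extremum.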
Further, the preset parameter is set to 0.5 and the first empirical value to 1.5. These parameters serve as the basis for measuring how far an extreme point has drifted toward an adjacent scale space and for bounding the allowable error between the extreme point and the interpolation center, so as to form accurate key points.
Further, the specific method in step 7 of traversing, in the registers, to a point to be detected in the neighborhood of the point that has gone through steps 1 to 6 is as follows. Traverse the comparison-point register to a point to be detected that is adjacent, on the same-scale image layer, to the point that has gone through steps 1 to 6, and determine it as the next point to be detected of that layer; output its pixel information from the comparison-point register to the detection-point register; output the finished point originally cached in the detection-point register to the comparison-point register; and determine the finished point as a neighborhood comparison point of the next point to be detected — an adjacent point to be detected on the same-scale layer is itself a neighborhood comparison point. Alternatively, traverse the comparison-point register to an adjacent point to be detected on the scale layer above or below, determine it as the next point to be detected of that adjacent layer, output its pixel information from the comparison-point register to the detection-point register, output the finished point to the comparison-point register, and determine the finished point as a neighborhood comparison point of the next point to be detected — an adjacent point to be detected on an adjacent-scale layer is likewise a neighborhood comparison point. The registers comprise the comparison-point register and the detection-point register. This reduces the complexity of fetching the neighborhood information of points to be detected from memory, improving algorithm efficiency, saving hardware circuit area, and improving the real-time performance of SIFT key-point detection.
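Advancing to the next point is thus a swap of register contents rather than a fresh SRAM read. A toy software model of that swap (class and method names are illustrative, not the patent's signal names):

```python
class PointRegisters:
    """Minimal model of the detection-point / comparison-point register pair."""

    def __init__(self, detect, compare):
        self.detect = detect      # pixel currently under key-point detection
        self.compare = compare    # cached neighbourhood comparison pixels

    def advance(self, next_index):
        """Promote a cached neighbour to the detection-point register and
        demote the finished point into the comparison-point register: the
        finished point is a neighbourhood comparison point of the new one,
        so no SRAM access is needed."""
        finished = self.detect
        self.detect = self.compare[next_index]
        self.compare[next_index] = finished
```

After `advance`, the old detection point sits exactly where its successor used to be cached, preserving the neighborhood layout.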
Drawings
Fig. 1 is a flowchart of a method for detecting key points of a SIFT algorithm based on a hardware circuit according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of distribution of pixel points for extreme point detection in a target pixel detection area according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of distribution of pixel points for detecting a keypoint in a set of difference gaussian scale spaces according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
An embodiment of the invention provides a hardware-circuit-based SIFT key-point detection method, comprising the following steps.
Step S101: control the registers to synchronously collect all pixel points of every image layer in a group of difference-of-Gaussian scale spaces, ensuring that each pixel point of every layer in the group has a corresponding register performing the synchronous collection, then go to step S102. The pixel points comprise the points to be detected and their neighborhood comparison points on all image layers of the group; the points to be detected of each layer enter and leave their registers synchronously; and the image under detection in this embodiment is not limited to constructing a single group of difference-of-Gaussian scale spaces.
Preferably, the current group of difference-of-Gaussian scale spaces has 5 image layers in total; the 3 adjacent layers in the middle carry the points to be detected, with 4 points per layer. The 4 points of each of those 3 layers can be collected synchronously through corresponding registers each time, so 3 layers are collected at once and key-point detection is performed on 4 points per layer.
Among all image layers of the group collected in step S101, this embodiment lets S be the number of difference-of-Gaussian layers actually searched for pixel points. The search is for local extreme values in the search space: searching a layer for a local extreme point requires the difference-of-Gaussian image of the layer above and the layer below, and the subsequent precise localization may shift the extreme point to the position of a neighborhood comparison point. Therefore, searching S difference-of-Gaussian layers for key points requires S + 2 layers, with the search running from layer 2 to layer S + 1; the number of image layers in the group collected in step S101 is thus S + 2.
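The layer-count rule is simple enough to pin down in code (a sketch; the helper name is illustrative):

```python
def layers_needed(S):
    """A search over S difference-of-Gaussian layers needs S + 2 layers in
    the group (one guard layer above and one below, for the 3x3x3
    comparison and for extreme-point drift into adjacent layers).
    Returns (total layers, first searched layer, last searched layer),
    with layers numbered from 1."""
    return S + 2, 2, S + 1
```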
As shown in fig. 3, this embodiment sets S = 3, so each group of difference-of-Gaussian scale spaces requires 5 difference-of-Gaussian layers, and the points to be detected lie in the middle 3 layers (layers 2, 3, and 4). For those 3 adjacent middle layers, this embodiment performs key-point detection on 4 points per layer, namely the pixel points marked by black dots in layers 2, 3, and 4; the black-dot pixels are the points to be detected of this embodiment. When the image under detection has resolution 640x480, each group of difference-of-Gaussian scale spaces covers 12 points in the middle 3 layers (4 points per layer), which is equivalent to processing 4 pixels of one row of the image; traversing the columns therefore requires 640/4 = 160 groups, and traversing the rows requires 480 such passes. So when 4x3 = 12 pixels for subsequent extreme-point detection are fetched from the registers at once across the 3 adjacent-scale layers, the key points of a 640x480 image can be detected in 160x480 passes (ignoring subsequent extreme-point offsets). In the prior art, when the pixels of a 640x480 image are read from SRAM for SIFT key-point detection, only 1 point of one layer of a group can be processed per read, requiring 640x480x3 reads. This embodiment thus greatly reduces the number of data acquisitions and analyses.
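The read-count comparison above can be sanity-checked directly (parameter names are illustrative; drift-induced extra reads are ignored, as in the text):

```python
def read_counts(width=640, height=480, points_per_layer=4, layers=3):
    """Register fills needed by this embodiment vs. single-point SRAM reads
    needed by the prior art, for one image of the given resolution."""
    group_reads = (width // points_per_layer) * height   # one fill per group
    prior_art_reads = width * height * layers            # one read per point/layer
    return group_reads, prior_art_reads
```

With the defaults this gives 76 800 register fills against 921 600 single-point reads, a 12x reduction in acquisition operations.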
The group of difference-of-Gaussian scale spaces has 5 image layers in total; the pixel points to be detected are distributed over the 3 adjacent layers in the middle, and the target pixel detection area occupies any 3 adjacent layers of the group. This creates a 3-dimensional neighborhood comparison space for detecting local spatial extreme points within the same group, and the target pixel detection area accounts for the various positions an extreme point can drift to after precise localization.
Step S102: obtain and update the pixel information of the point to be detected and its neighborhood comparison points cached in the registers, then go to step S103. During the subsequent key-point refinement, if a coordinate offset reaches 1 pixel unit, the point to be detected can drift into the region of the white-dot-marked pixels of layer 2, 3, or 4; the point then needs extreme-point detection at its new coordinate position, which is why the cached pixel information of the point and its neighborhood comparison points must be updated.
Step S103: establish a target pixel detection area centered on the point to be detected according to the pixel information of the point and its neighborhood comparison points, then go to step S104. The search for an extreme point in the subsequent step S104 must be carried out not only in the neighborhood within the point's own scale-space image but also in the adjacent scale spaces: each point to be detected is compared with its 8 neighbors in the same-scale layer and the 9×2 corresponding points in the adjacent-scale layers above and below, so the pixel values of 8 + 9×2 = 26 points are compared to complete extreme-point detection, and the target pixel detection area forms a 3x3x3 cubic region.
In this embodiment, the pixels marked by white dots in layer 2, 3, or 4 serve as the neighborhood comparison points of the points to be detected, and the pixels marked by black dots are the points to be detected. If a coordinate offset reaches 1 pixel unit, a point to be detected can drift into the white-dot pixel region of layer 2, 3, or 4, so the drifted points and their neighborhood comparison points together form a 6x3x3 cubic region. Since the target pixel detection area for a single extreme-point detection is a 3x3x3 cube — i.e. the pixel search spans 3 difference-of-Gaussian layers and each group requires 5 layers — detecting key points over the 6x3x3 cube requires the pixel information contained in an 8x5x5 cube. Step S101 of this embodiment therefore collects the pixel information of an 8x5x5 difference-of-Gaussian region of the group at one time, enabling key-point detection of 12 pixels across the 3 layers. Each data acquisition thus covers a wider range: instead of fetching only the region needed by a single point to be detected, one acquisition supports key-point detection of 3 layers with 4 points per layer.
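The 6x3x3 and 8x5x5 figures follow from simple border arithmetic: a 1-pixel drift widens the row of candidate points by one on each side, and every candidate then needs one further guard sample per side for its 3x3x3 neighbourhood. A sketch (function and parameter names are illustrative):

```python
def collected_region(points_per_layer=4, search_layers=3):
    """Dimensions (layers, rows, cols) of the drift cube and of the region
    that must be collected at once so every possibly-drifted point still
    has a full 3x3x3 neighbourhood."""
    cols_after_drift = points_per_layer + 2        # 1-pixel drift each side
    drift_cube = (search_layers, 3, cols_after_drift)
    region = tuple(n + 2 for n in drift_cube)      # 1 guard sample each side
    return drift_cube, region
```

With the embodiment's values this reproduces the 6x3x3 drift cube and the 8x5x5 collection region.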
Step S104: judge whether the point to be detected is an extreme point of the target pixel detection area from the comparison of its pixel information with that of its neighborhood comparison points; if so, go to step S105, otherwise go to step S116. Specifically, compare the pixel value of the point to be detected with the pixel values of its 8 adjacent comparison points in the same-scale layer and, simultaneously, with the 9×2 corresponding comparison points in the adjacent-scale layers above and below; when its pixel value is the maximum or minimum within the target pixel detection area, the point is determined to be an extreme point of that area. The pixel value is the value of a function of three variables — the coordinates (x, y) of the point and the scale σ of its layer in the difference-of-Gaussian scale space; the neighborhood comparison points comprise the adjacent comparison points in the same-scale layer and the corresponding points in the adjacent-scale layers above and below. In this embodiment, comparing the point to be detected with its neighborhood comparison points one by one, and confirming extreme points in real time, improves detection accuracy and ensures no extreme point is lost.
As shown in fig. 2, pixel No. 14 is marked as the point to be detected, meaning it must be compared with pixels No. 1 to 27 except itself. For each point to be detected, the pixel information of the 3x3x3 target pixel detection area centered on it must therefore be acquired; the pixel information contained in that 3x3x3 area is also exactly what the subsequent steps use for precise extreme-point localization, low-contrast removal, and edge-response removal.
Step S105: within the target pixel detection area, take the coordinate position of the extreme point detected in step S104 as the interpolation center, obtain the extreme-point offset through one pass of precise extreme-point localization, then go to step S106. The extreme-point offset comprises the coordinate offsets on the x, y, and σ axes of the difference-of-Gaussian scale space; for coordinates (x, y, σ), x and y carry the position information of a pixel point and σ its scale information, so the extreme-point offset describes the position of the pixel point in the difference-of-Gaussian scale space along every coordinate dimension.
Step S106: determine whether any coordinate offset in the extreme-point offset obtained in step S105 is greater than or equal to the first empirical value; if so, abandon further precise localization of the extreme point detected in step S104 and go directly to step S116, otherwise go to step S107. In the hardware implementation of the algorithm, this effectively reduces the detection power and hardware resources spent per pixel.
Accurate key point positioning must yield the position of the precise point at sub-pixel level, so the deviation should basically lie within 1 pixel. Therefore, the algorithm of this embodiment eliminates extreme points whose offsets exceed the first empirical value, which is set here to 1.5.
Step S107, judging whether the coordinate offsets included in the extreme point offset obtained in step S105 are all smaller than the preset parameter, i.e. judging, on the basis that the extreme point offset is smaller than the first empirical value, whether it is also smaller than the preset parameter; if so, entering step S114, otherwise entering step S108.
It should be noted that in this embodiment the preset parameter is set to 0.5 and serves as a measure of how far the extreme point has shifted toward an adjacent scale space: when the coordinate offset on any of the x, y and σ coordinate axes is greater than or equal to 0.5, the extreme point deviates from the interpolation center and can be regarded as having shifted toward the adjacent scale space. The preset parameter of 0.5 also defines the allowable error range of the extreme point relative to the interpolation center, forming the parameter basis for accurate key point positioning. When the coordinate offsets on the x, y and σ axes are all smaller than 0.5, the extreme point has not deviated from the interpolation center. Since the precise position obtained by accurate key point positioning is a sub-pixel-level concept, in order to locate the fractional part of the pixel coordinate more precisely, the interpolation center position is changed within a limited number of iterations of the accurate extreme point positioning disclosed in step S105, so that the extreme point offset can be processed into an integer, which facilitates the processing of the pixel information by the hardware resources.
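The threshold logic around the preset parameter 0.5 and the first empirical value 1.5 amounts to a three-way decision per offset vector; a minimal sketch, with the function name assumed for illustration:

```python
def classify_offset(offset, preset=0.5, first_empirical=1.5):
    """offset: coordinate offsets (dx, dy, d_sigma) from one pass of
    accurate extreme point positioning. Thresholds are the values
    quoted in the text (0.5 and 1.5)."""
    m = max(abs(v) for v in offset)
    if m >= first_empirical:
        return "reject"   # too far from the interpolation centre: discard
    if m >= preset:
        return "shift"    # move interpolation centre and reposition
    return "accept"       # converged: candidate key point
```

The "shift" outcome corresponds to step S108 below, "accept" to step S114, and "reject" to skipping directly to step S116.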
Step S108, when the coordinate offsets included in the extreme point offset are all smaller than the first empirical value but at least one of them is greater than or equal to the preset parameter, correcting the pixel coordinate information of the pixel point to be detected according to the extreme point offset of its coordinate position, so that the pixel point to be detected shifts to the position of a new interpolation center, and then entering step S109. In this embodiment, when the coordinate offset obtained in step S105 on any of the x, y and σ axes is 0.7, it is greater than the preset parameter 0.5 and smaller than the first empirical value 1.5, so the pixel point to be detected is shifted by 1 pixel unit into the neighborhood along the positive or negative direction of the corresponding coordinate axis according to the coordinate offset 0.7; after the shift, the pixel point to be detected lies at a new interpolation center, whose position may be one of its neighborhood comparison points. The pixel coordinate information of the pixel point to be detected is corrected, and the pixel coordinate information cached in the corresponding register is updated.
As shown in fig. 2, assume pixel No. 14 is the pixel point to be detected and has been detected as an extreme point, and the coordinate offset on some coordinate axis of x, y and σ obtained by the accurate extreme point positioning of step S105 is 0.7, which is greater than 0.5. After step S108 is executed, pixel No. 14 is shifted to one of the coordinate positions of pixels No. 1 to No. 27 other than its own, so that a new interpolation center is formed at that position for the next pass of accurate extreme point positioning; the shift direction is determined by the coordinate axis on which the relevant coordinate offset of the extreme point offset lies.
Step S109, acquiring the pixel information of the neighborhood comparison points of the pixel point to be detected whose pixel coordinate information has been corrected, and establishing a new target pixel detection area centered on the pixel point to be detected that has shifted to the new interpolation center; the pixel point to be detected with corrected pixel coordinate information is determined as a new extreme point, and then step S110 is entered to accurately position this extreme point.
Step S110, in the new target pixel detection area established in step S109, taking the pixel point to be detected corrected in step S108 as the new interpolation center, obtaining a new extreme point offset through one pass of accurate extreme point positioning, and then entering step S111. At this point the interpolation center has been changed, and the extreme point is processed iteratively by applying accurate extreme point positioning again.
Step S111, judging whether the coordinate offsets included in the new extreme point offset obtained in step S110 are all smaller than the preset parameter 0.5. If so, entering step S112: this means that after the extreme point offset first obtained in step S105 has undergone two passes of accurate extreme point positioning, the coordinate offsets on the x, y and σ coordinate axes have converged below the preset parameter 0.5, indicating that the extreme point no longer deviates from the interpolation center, and the pixel point to be detected corrected in step S108 is determined as a candidate key point. Otherwise, entering step S116: when the new extreme point offset obtained in step S110 still contains a coordinate offset greater than or equal to 0.5 on any of the x, y and σ axes, the extreme point has already been accurately positioned twice but its coordinate offsets on the x, y and σ axes still fail to fall below 0.5, so further accurate positioning of the extreme point corrected in step S108 is abandoned, the corrected pixel point is determined not to be a key point, and step S116 is entered.
In the foregoing embodiment, the extreme point offset obtained by accurate extreme point positioning defines the iteration mode of the subsequent accurate positioning, including the number of iterations, the iteration condition, and the convergence result of the extreme point offset, so that the pixel coordinate information of the pixel point to be detected is corrected according to the extreme point offset during the iterations. Based on the trade-off between algorithm performance and hardware circuit area, the maximum number of accurate positioning iterations is set to 2, which effectively reduces the interpolation processing power consumption of the pixel point to be detected and the hardware resources used by the algorithm hardware.
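The bounded iteration described above (at most two passes of accurate positioning, with the 0.5/1.5 thresholds) might be modelled as follows; `localize` and `shift_centre` are hypothetical stand-ins for the Taylor-fit step and the interpolation-centre move, not actual circuit blocks:

```python
def refine_keypoint(localize, shift_centre, pos,
                    preset=0.5, first_empirical=1.5, max_iter=2):
    """Return a converged candidate position, or None if the point is
    discarded. `localize(pos)` yields an offset (dx, dy, d_sigma);
    `shift_centre(pos, offset)` moves the interpolation centre."""
    for _ in range(max_iter):
        offset = localize(pos)
        m = max(abs(v) for v in offset)
        if m >= first_empirical:
            return None                      # offset too large: discard
        if m < preset:
            return pos                       # converged: candidate key point
        pos = shift_centre(pos, offset)      # move to new interpolation centre
    return None                              # no convergence within 2 passes
```

Capping `max_iter` at 2 is the patent's stated performance/area compromise; a pure-software SIFT would typically allow more iterations.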
Step S112, removing extreme points with low contrast and edge response, which screens out the invalid extreme points obtained by the accurate extreme point positioning of the previous step, and then entering step S113. Low-contrast screening and edge-effect elimination are common techniques for removing unstable extreme points in the existing SIFT algorithm and are not repeated here. Step S112 further screens and corrects the coordinate position of the pixel point to be detected on the basis of the accurate extreme point positioning and pixel shift processing of step S110, enhancing the noise resistance of the pixel point.
Step S113, after the extreme point screening process of step S112, determining the modified pixel point to be detected after removing the invalid extreme point as the key point of the image layer, and then entering step S116.
Step S114, when step S107 determines that the coordinate offsets on the x, y and σ coordinate axes included in the extreme point offset are all smaller than the preset parameter 0.5, indicating that the pixel point to be detected has not deviated from the interpolation center and becomes a candidate key point, removing the extreme points with low contrast and edge response, which screens out part of the invalid extreme points obtained by the accurate extreme point positioning of step S105, and then entering step S115. Since low-contrast screening and edge-effect elimination are common techniques for removing unstable extreme points in the existing SIFT algorithm, they are not described further. Step S114 further screens and corrects the coordinate position of the pixel point to be detected on the basis of the accurate extreme point positioning of step S105, enhancing its noise resistance.
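The low-contrast and edge-response tests are the standard checks from the classic SIFT algorithm rather than something this patent defines; a hedged 2-D sketch, with illustrative (not patent-specified) threshold values:

```python
import numpy as np

def passes_stability_tests(patch, contrast_thr=0.03, edge_ratio=10.0):
    """patch: 3x3 spatial neighbourhood of DoG values around the key
    point. Rejects low-contrast points and edge responses via the
    trace/determinant ratio of the 2x2 spatial Hessian."""
    c = patch[1, 1]
    if abs(c) < contrast_thr:               # low contrast: unstable
        return False
    # Central-difference spatial Hessian at the centre.
    dxx = patch[1, 2] - 2 * c + patch[1, 0]
    dyy = patch[2, 1] - 2 * c + patch[0, 1]
    dxy = 0.25 * (patch[2, 2] - patch[2, 0] - patch[0, 2] + patch[0, 0])
    tr, det = dxx + dyy, dxx * dyy - dxy * dxy
    if det <= 0:                            # curvatures differ in sign: edge/saddle
        return False
    # Edge response: principal-curvature ratio bounded by edge_ratio.
    return tr * tr / det < (edge_ratio + 1.0) ** 2 / edge_ratio
```

In hardware the division would typically be replaced by a cross-multiplied comparison to avoid a divider.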
Step S115, determining the modified pixel point to be detected from which the invalid extreme point is removed as a key point of the image layer after the extreme point screening process of step S114, and then entering step S116.
Step S116, judging whether all the pixel points to be detected on all image layers in the current group of difference-of-Gaussian scale spaces, i.e. the pixel points corresponding to the 12 black dot marks in fig. 3, have been detected; if so, returning to step S101 and re-collecting the pixel points on each image layer of the next group of difference-of-Gaussian scale spaces, otherwise entering step S117. After the accurate key point positioning of the foregoing steps, the extreme point offset is obtained; when the extreme point offset on any of the x, y and σ coordinate axes is greater than or equal to 1.5, or when the coordinate offsets on the x, y and σ axes are all smaller than 1.5 but still cannot be converged below 0.5 after two passes of accurate extreme point positioning, step S116 is executed to traverse to the next pixel point to be detected. After a key point is detected and determined, its current pixel coordinate is marked, and step S116 is then executed to continue traversing to the next pixel point to be detected. Before traversing to the next pixel point to be detected, this embodiment first judges whether all 12 pixel points to be detected that participate in one round of key point detection have completed the detection process of the foregoing steps; if the number of pixel points that have completed the process has not reached 12, step S117 is entered.
It should be noted that the above pixel information comprises the pixel point to be detected and its neighborhood comparison points, and the pixel coordinate information comprises the pixel coordinate of any coordinate dimension of the difference-of-Gaussian scale space.
The foregoing steps screen the detected extreme points according to the magnitude of the extreme point offset; when the extreme point offset satisfies a certain deviation condition relative to the interpolation center, curve fitting by one Taylor expansion is performed to find a more suitable extreme point and reduce the deviation from the interpolation center, and finally, whether the new extreme point whose offset has converged after two passes of accurate extreme point positioning is an accurate key point of the image layer is determined. This fully considers the match between algorithm performance and hardware circuit area, and also bounds the time for detecting and processing the next pixel point or the pixel points of the next group of difference scale spaces.
Step S117, traversing from the registers the pixel points to be detected in the neighborhood, i.e. the neighboring pixel points to be detected of the pixel point that has already participated in the foregoing key point detection steps, taking one of them as the next pixel point to be detected on the same image layer or an adjacent image layer of the same group of difference-of-Gaussian scale spaces for the SIFT algorithm key point detection method; then returning to step S102, updating the coordinates of the pixel point to be detected in the same group of difference-of-Gaussian scale spaces, and continuing the key point detection process for the next pixel point to be detected.
Preferably, in this embodiment, the comparison point register is traversed for the pixel points to be detected that are adjacent, on the same-scale image layer, to the pixel point that has already participated in the key point detection steps S101 to S116; one such neighbor is determined as the next pixel point to be detected on the same-scale image layer, the pixel information of this next pixel point cached in the comparison point register is output to the detection point register, and at the same time the pixel point to be detected originally cached in the detection point register, which has participated in steps S101 to S116, is output to the comparison point register and determined as a neighborhood comparison point of the next detection pixel point; the adjacent pixel points to be detected of that pixel point on the same-scale image layer are likewise its neighborhood comparison points. Alternatively, the comparison point register is traversed for the pixel points to be detected that are adjacent, on the upper and lower adjacent-scale image layers, to the pixel point that has participated in steps S101 to S116; one such neighbor is determined as the next pixel point to be detected on the adjacent-scale image layer, its pixel information cached in the comparison point register is output to the detection point register, and the pixel point that has participated in steps S101 to S116, originally cached in the detection point register, is output to the comparison point register and determined as a neighborhood comparison point of the next detection pixel point; the adjacent pixel points to be detected on the adjacent-scale image layers are likewise its neighborhood comparison points. The registers comprise the comparison point register and the detection point register. This embodiment reduces the complexity of acquiring the area information of the point to be detected from memory, thereby improving algorithm efficiency, saving hardware circuit area, and improving the real-time performance of SIFT algorithm key point detection.
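The comparison-point/detection-point register exchange might be modelled in software as a simple buffer swap, so cached neighbourhood data is reused instead of re-fetched from memory; the class and names below are illustrative assumptions, not the actual circuit:

```python
class KeypointScanner:
    """Toy model of the two-register scheme: one detection-point buffer
    and a set of comparison-point buffers holding the neighbourhood."""

    def __init__(self, detect_reg, compare_regs):
        self.detect_reg = detect_reg          # pixel info of current point
        self.compare_regs = compare_regs      # cached neighbourhood points

    def advance(self, idx):
        """Swap the finished detection point with neighbour `idx`:
        the neighbour becomes the next point to detect, and the old
        detection point becomes one of its comparison points."""
        self.detect_reg, self.compare_regs[idx] = (
            self.compare_regs[idx], self.detect_reg)
        return self.detect_reg
```

The swap costs one register exchange rather than a fresh 3x3x3 memory fetch, which is the stated source of the area and real-time gains.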
In the above steps, the image layers and their pixel points required to construct the difference-of-Gaussian scale space are unified into one key point detection space by acquiring multi-scale image layers and multi-point image pixels into registers at one time, which reduces the complexity of hardware addressing operations and meets the fast data-processing requirement of dedicated hardware; the accurate key point is determined by extreme point detection plus two passes of accurate extreme point positioning, which greatly reduces the complexity of acquiring the area information of the point to be detected. The method therefore greatly simplifies data acquisition and address management and reduces hardware processing complexity, thereby improving algorithm efficiency, saving hardware circuit area, and improving the real-time performance of SIFT algorithm key point detection.
In the foregoing embodiment, the accurate extreme point positioning method comprises: when the pixel point to be detected is determined to be an extreme point, performing a Taylor expansion of the extreme point at the interpolation center position, then setting the derivative of the Taylor expansion to zero and solving the resulting equation, thereby obtaining the offset of the pixel point to be detected relative to the interpolation center; the Taylor expansion serves as an approximation method to obtain a high-precision function. In this embodiment, the Taylor expansion is used to locate the fractional part of the extreme point coordinates more finely during accurate extreme point positioning. When the obtained extreme point offset on any of the x, y and σ coordinate axes is greater than or equal to 0.5, the current interpolation point deviates from the set interpolation center; the interpolation center is then changed according to the currently obtained extreme point offset, the Taylor expansion is applied at the changed interpolation center, the equation obtained by setting its derivative to zero is solved, and a new extreme point offset is obtained. If the x, y and σ coordinate offsets included in the extreme point offset are all smaller than 0.5, the extreme point has been accurately positioned, which is equivalent to rounding the fractional part of the extreme point coordinates. Considering both algorithm performance and hardware circuit area, the hardware algorithm sets the maximum number of iterations to 2, shortening the operation cycle. The extreme point offset comprises the coordinate offset of any spatial coordinate dimension of the difference-of-Gaussian scale space.
This embodiment fits the discrete points of the space with a continuous curve, reducing the error of locating the true extreme point in the discrete space.
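One pass of the accurate positioning described above (Taylor expansion at the interpolation centre, derivative set to zero, giving H·δ = −g) can be sketched numerically with central differences over the 3x3x3 cube; the function name and the (σ, y, x) axis order are assumptions for illustration:

```python
import numpy as np

def localization_offset(cube: np.ndarray) -> np.ndarray:
    """cube: 3x3x3 DoG block (sigma, y, x) centred on the detected
    extreme point; returns the sub-pixel offset (d_sigma, dy, dx)."""
    c = cube[1, 1, 1]
    # Central-difference gradient g at the centre voxel.
    g = 0.5 * np.array([
        cube[2, 1, 1] - cube[0, 1, 1],
        cube[1, 2, 1] - cube[1, 0, 1],
        cube[1, 1, 2] - cube[1, 1, 0],
    ])
    # Central-difference Hessian H (symmetric 3x3).
    H = np.empty((3, 3))
    H[0, 0] = cube[2, 1, 1] - 2 * c + cube[0, 1, 1]
    H[1, 1] = cube[1, 2, 1] - 2 * c + cube[1, 0, 1]
    H[2, 2] = cube[1, 1, 2] - 2 * c + cube[1, 1, 0]
    H[0, 1] = H[1, 0] = 0.25 * (cube[2, 2, 1] - cube[2, 0, 1]
                                - cube[0, 2, 1] + cube[0, 0, 1])
    H[0, 2] = H[2, 0] = 0.25 * (cube[2, 1, 2] - cube[2, 1, 0]
                                - cube[0, 1, 2] + cube[0, 1, 0])
    H[1, 2] = H[2, 1] = 0.25 * (cube[1, 2, 2] - cube[1, 2, 0]
                                - cube[1, 0, 2] + cube[1, 0, 0])
    # Solve H * delta = -g: offset of the true extremum of the fit.
    return np.linalg.solve(H, -g)
```

For a quadratic DoG surface the central differences are exact, so the returned offset is the true sub-pixel displacement; a hardware version would replace the 3x3 solve with a fixed-point Cramer's-rule datapath.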
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and not to limit it. Although the present invention has been described in detail with reference to preferred embodiments, those skilled in the art will understand that modifications to the specific embodiments of the invention, or equivalent substitutions of some of its technical features, may be made without departing from the spirit of the present invention, and such variations are intended to fall within the scope of the appended claims.

Claims (7)

1. A hardware circuit-based SIFT algorithm key point detection method is characterized by comprising the following steps:
step 1, synchronously collecting all pixel points of each image layer in a group of difference Gaussian scale spaces, wherein the pixel points of the adjacent 3 image layers in the middle of the group of difference Gaussian scale spaces comprise the same number of pixel points to be detected and neighborhood comparison points thereof;
step 2, obtaining and updating pixel information of the pixel point to be detected and the neighborhood comparison point thereof cached in the register;
step 3, establishing a target pixel detection area taking the pixel point to be detected as the center according to the pixel information of the pixel point to be detected and the neighborhood comparison points thereof;
step 4, judging whether the pixel point to be detected is an extreme point in the target pixel detection area or not by comparing the magnitude relation of the pixel information of the pixel point to be detected and the neighborhood comparison points thereof;
step 5, entering step 7 when the pixel point to be detected is not the extreme point in step 4;
step 6, when the pixel point to be detected is detected to be the extreme point in the step 4, limiting an iteration mode of subsequent extreme point accurate positioning according to the extreme point offset obtained by accurately positioning the extreme point, enabling the pixel coordinate information of the pixel point to be detected to finish offset correction according to the extreme point offset in the iteration, and removing invalid extreme points to obtain key points; the pixel coordinate information comprises pixel coordinates of any coordinate dimension in a difference Gaussian scale space;
and 7, judging whether the to-be-detected pixel points of all image layers in the current group of difference Gaussian scale spaces complete the key point detection, if so, returning to the step 1 to collect the pixel points of all image layers in the next group of difference Gaussian scale spaces, otherwise, traversing the to-be-detected pixel points of the to-be-detected pixel points participating in the steps 1 to 6 in the neighborhood from the register, and returning to the step 2.
2. The SIFT algorithm key point detection method according to claim 1, wherein the number of image layers in the set of difference gaussian scale spaces is 5, and the pixel points to be detected are distributed in the 3 image layers adjacent to each other in the middle of the set of difference gaussian scale spaces, so that the target pixel detection area spans any 3 adjacent layers of the set of difference gaussian scale spaces.
3. The SIFT algorithm key point detecting method according to claim 2, wherein the step 4 specifically comprises:
comparing the pixel value of the pixel point to be detected with the pixel values of 8 adjacent comparison points of the same-scale image layer, simultaneously comparing the pixel value of the pixel point to be detected with the pixel values of 9 multiplied by 2 corresponding adjacent comparison points of upper and lower adjacent-scale image layers, and when the pixel value of the pixel point to be detected is the maximum value or the minimum value, determining that the pixel point to be detected is the extreme value point in the target pixel detection area;
the pixel value is a ternary function value formed by coordinates (x, y) of the pixel point to be detected and a neighborhood comparison point thereof and a scale sigma of the image layer, wherein the pixel coordinate of any coordinate dimension comprises pixel coordinates on an x axis, a y axis and a sigma axis; the neighborhood comparison points comprise adjacent comparison points of the pixel points to be detected on the same scale image layer and corresponding adjacent comparison points of the upper and lower adjacent scale image layers.
4. The SIFT algorithm key point detecting method according to claim 1, wherein the specific method of the step 6 comprises:
in the target pixel detection area, taking the extreme point detected in the step 4 as an interpolation center position, and then obtaining the offset of the extreme point through one-time extreme point accurate positioning, wherein the offset of the extreme point comprises coordinate offsets of the extreme point on an x axis, a y axis and a sigma axis;
when the coordinate offset included by the extreme point offset is smaller than the preset parameter, removing the extreme points with low contrast and edge response, and determining the pixel point to be detected as the key point of the image layer;
when a coordinate offset which is greater than or equal to the first empirical value exists among the coordinate offsets included in the extreme point offset, abandoning further detection of the extreme point detected in the step 4, and returning to the step 7; wherein the first empirical value is greater than the preset parameter;
when the coordinate offset included by the extreme point offset is smaller than a first experience value and the coordinate offset which is larger than or equal to the preset parameter exists, correcting the pixel coordinate information of the pixel point to be detected according to the extreme point offset, so that the pixel point to be detected is offset to the position of a new interpolation center; then, acquiring pixel information of a neighborhood comparison point of the pixel to be detected after pixel coordinate information is corrected, establishing a new target pixel detection area by taking the pixel to be detected which is shifted to a new interpolation center as a center, performing one-time extreme point accurate positioning in the new target pixel detection area to acquire a new extreme point offset, judging whether the coordinate offsets included in the new extreme point offset are all smaller than the preset parameter, if so, removing the extreme points with low contrast and edge response, then determining the corrected pixel to be detected after removing the invalid extreme point as a key point of the image layer, and if not, returning to the step 7; wherein the extreme point of low contrast and edge response is the invalid extreme point.
5. The SIFT algorithm keypoint detection method of claim 4, wherein the extreme point precise positioning method comprises:
and when the pixel point to be detected is determined to be the extreme point, controlling the extreme point to perform Taylor expansion at the position of the interpolation center, then solving a Taylor expansion formula, and obtaining the offset of the pixel point to be detected relative to the interpolation center.
6. The SIFT algorithm key point detecting method according to claim 4, wherein the predetermined parameter is set to 0.5, and the first experience value is set to 1.5.
7. The SIFT algorithm key point detecting method according to claim 1, wherein in the step 7, the specific method for traversing, from the register, the pixel points to be detected in the neighborhood of the pixel point to be detected that has participated in the steps 1 to 6 is as follows:
traversing a comparison point register, namely traversing a to-be-detected pixel point adjacent to the to-be-detected pixel point participating in the steps 1 to 6 on the same-scale image layer, determining the to-be-detected pixel point as a next to-be-detected pixel point of the same-scale image layer, outputting pixel information of the next to-be-detected pixel point of the same-scale image layer cached by the comparison point register to a detection point register, outputting the to-be-detected pixel point which is originally cached by the detection point register and participates in the steps 1 to 6 to the comparison point register, and determining the to-be-detected pixel point which participates in the steps 1 to 6 as a neighborhood comparison point of the next to-be-detected pixel point, wherein the adjacent to-be-detected pixel point of the same-scale image layer is also the neighborhood comparison point;
or traversing a comparison point register, traversing adjacent to-be-detected pixel points of the to-be-detected pixel points participating in the steps 1 to 6 on an upper and lower adjacent scale image layer, determining the adjacent to-be-detected pixel points as next to-be-detected pixel points of the adjacent scale image layer, outputting pixel information of the next to-be-detected pixel points of the adjacent scale image layer cached by the comparison point register to a detection point register, outputting the to-be-detected pixel points participating in the steps 1 to 6 originally cached by the detection point register to the comparison point register, and determining the to-be-detected pixel points participating in the steps 1 to 6 as neighborhood comparison points of the next to-be-detected pixel point, wherein the adjacent to-be-detected pixel points of the adjacent scale image layer are also the neighborhood comparison points of the to-be-detected pixel points;
wherein the registers include a compare point register and a detect point register.
CN201910736007.2A 2019-08-09 2019-08-09 SIFT algorithm key point detection method based on hardware circuit Active CN112348032B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910736007.2A CN112348032B (en) 2019-08-09 2019-08-09 SIFT algorithm key point detection method based on hardware circuit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910736007.2A CN112348032B (en) 2019-08-09 2019-08-09 SIFT algorithm key point detection method based on hardware circuit

Publications (2)

Publication Number Publication Date
CN112348032A true CN112348032A (en) 2021-02-09
CN112348032B CN112348032B (en) 2022-10-14

Family

ID=74366946

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910736007.2A Active CN112348032B (en) 2019-08-09 2019-08-09 SIFT algorithm key point detection method based on hardware circuit

Country Status (1)

Country Link
CN (1) CN112348032B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117037272A (en) * 2023-08-08 2023-11-10 深圳市震有智联科技有限公司 Method and system for monitoring fall of old people

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413326A (en) * 2013-08-12 2013-11-27 上海盈方微电子股份有限公司 Method and device for detecting feature points in Fast approximated SIFT algorithm
CN103593850A (en) * 2013-11-26 2014-02-19 北京航空航天大学深圳研究院 SIFT parallelization system and method based on recursion Gaussian filtering on CUDA platform
CN106960451A (en) * 2017-03-13 2017-07-18 西安电子科技大学 A kind of method for lifting the weak texture region characteristic point quantity of image
CN108304883A (en) * 2018-02-12 2018-07-20 西安电子科技大学 Based on the SAR image matching process for improving SIFT
CN108734179A (en) * 2018-05-22 2018-11-02 东南大学 The method of SIFT key points description based on hardware realization optimization
CN109522906A (en) * 2018-10-23 2019-03-26 天津大学 The quick SIFT feature extracting method of low complex degree based on FPGA


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
王玉亮等: "基于CUDA的眼底图像快速自动配准与拼接", 《中国机械工程》 *
胥陈彧: "SIFT算法的图像特征处理模块的芯片设计研究", 《中国优秀博硕士学位论文全文数据库(硕士)》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117037272A (en) * 2023-08-08 2023-11-10 深圳市震有智联科技有限公司 Method and system for monitoring fall of old people
CN117037272B (en) * 2023-08-08 2024-03-19 深圳市震有智联科技有限公司 Method and system for monitoring fall of old people

Also Published As

Publication number Publication date
CN112348032B (en) 2022-10-14

Similar Documents

Publication Publication Date Title
CN107301654B (en) Multi-sensor high-precision simultaneous localization and mapping method
Jiang et al. Multiscale locality and rank preservation for robust feature matching of remote sensing images
Moulon et al. Adaptive structure from motion with a contrario model estimation
CN107228860B (en) Gear defect detection method based on image rotation period characteristics
CN109685732B (en) High-precision depth image restoration method based on boundary capture
CN109993800A (en) Workpiece size detection method, device and storage medium
CN104867137B (en) Image registration method based on improved RANSAC algorithm
CN109035170B (en) Self-adaptive wide-angle image correction method and device based on single grid image segmentation mapping
US8068673B2 (en) Rapid and high precision centroiding method and system for spots image
CN102169581A (en) Fast and high-precision robust matching method based on feature vectors
CN111369495A (en) Video-based panoramic image change detection method
CN114743259A (en) Pose estimation method, pose estimation system, terminal, storage medium and application
CN110544202A (en) parallax image splicing method and system based on template matching and feature clustering
CN108537832B (en) Image registration method and image processing system based on local invariant gray feature
CN112348032B (en) SIFT algorithm key point detection method based on hardware circuit
Chen et al. Automatic checkerboard detection for robust camera calibration
CN110390338B (en) SAR high-precision matching method based on nonlinear guided filtering and ratio gradient
Xu et al. Hierarchical convolution fusion-based adaptive Siamese network for infrared target tracking
CN116503462A (en) Method and system for quickly extracting circle center of circular spot
Min et al. Non-rigid registration for infrared and visible images via gaussian weighted shape context and enhanced affine transformation
JP2017130067A (en) Automatic image processing system for improving position accuracy level of satellite image and method thereof
CN111476812A (en) Map segmentation method and device, pose estimation method and equipment terminal
CN112950650B (en) Deep learning distorted light spot center extraction method suitable for high-precision morphology measurement
CN117496401A (en) Full-automatic identification and tracking method for oval target points of video measurement image sequences
CN107392948A (en) Image registration method for an amplitude-division real-time polarization imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: 519000 room 105-514, No. 6, Baohua Road, Hengqin new area, Zhuhai, Guangdong

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

GR01 Patent grant