CN110245671B - Endoscope image feature point matching method and system - Google Patents

Endoscope image feature point matching method and system

Info

Publication number
CN110245671B
CN110245671B (application CN201910521138.9A)
Authority
CN
China
Prior art keywords
feature point
matching
point
image
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910521138.9A
Other languages
Chinese (zh)
Other versions
CN110245671A (en)
Inventor
杨峰
李文杰
李旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airui Maidi Technology Shijiazhuang Co ltd
Original Assignee
Airui Maidi Technology Shijiazhuang Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airui Maidi Technology Shijiazhuang Co ltd filed Critical Airui Maidi Technology Shijiazhuang Co ltd
Priority to CN201910521138.9A priority Critical patent/CN110245671B/en
Publication of CN110245671A publication Critical patent/CN110245671A/en
Application granted granted Critical
Publication of CN110245671B publication Critical patent/CN110245671B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Endoscopes (AREA)

Abstract

The embodiments of the invention provide a method and a system for matching endoscope image feature points. Image feature points are extracted from two endoscope images to be matched, matched according to the similarity of their feature descriptors, and an initial set of matching point pairs is obtained. A local distance constraint is then applied to the initial matching point pair set, the feature point information is extended with affine parameters and the motion of the corresponding feature points, and a similarity boundary is estimated to obtain a set of feature point correspondences with motion consistency. Spatial distance perception is optimized on the basis of this correspondence set to generate a bilateral affine motion consistency model; an adaptive distance threshold parameter is set for the bilateral motion boundary, an inlier matching set covering the global image is obtained, and feature point matching is achieved. Through a locality-preserving motion consistency method, the embodiments find reliable correspondences within the given initial matches and retain a sufficient number of feature point matching pairs while ensuring high precision.

Description

Endoscope image feature point matching method and system
Technical Field
The invention relates to the field of image processing, in particular to a method and a system for matching endoscope image feature points.
Background
An endoscope is an optical instrument composed of a cold-light-source lens, optical fiber light guides, an image transmission system, a display system, and so on. It expands the surgical field of view and, compared with traditional open surgery, offers the notable advantages of a small incision, an inconspicuous scar, a mild postoperative reaction, greatly reduced bleeding, shorter bruising and swelling time, and a quicker recovery. In the field of computer-assisted medical endoscopic surgery, image processing is commonly used to guide minimally invasive operations, and establishing reliable correspondences between different images is a key issue in numerous clinical applications such as tissue surface reconstruction, camera motion estimation, and surgical navigation.
Existing endoscope image feature point matching methods generally use a common local feature detection and description algorithm, for example the Affine Scale-Invariant Feature Transform (hereinafter ASIFT), to extract feature points from the images, and then match the feature points by the similarity of the corresponding feature descriptors to obtain the feature point correspondences between the images. However, because the tissue structures in endoscope images usually carry weak texture information and are subject to occlusion and deformation, it is difficult to obtain a satisfactory matching result by relying on descriptor similarity alone.
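As a rough illustration of this conventional baseline (not of the patented method), the following Python sketch extracts local features with OpenCV and matches them purely by descriptor similarity. OpenCV does not ship ASIFT, so plain SIFT is used here as a stand-in, and the image paths and ratio-test value are illustrative assumptions.

```python
import cv2

# Load the two endoscope frames to be matched (paths are placeholders)
img_r = cv2.imread("reference_frame.png", cv2.IMREAD_GRAYSCALE)  # reference image I_r
img_t = cv2.imread("template_frame.png", cv2.IMREAD_GRAYSCALE)   # template image I_t

# Detect keypoints and compute descriptors (SIFT used here instead of ASIFT)
sift = cv2.SIFT_create()
kp_r, des_r = sift.detectAndCompute(img_r, None)
kp_t, des_t = sift.detectAndCompute(img_t, None)

# Initial matching purely by descriptor similarity (2-nearest-neighbour search)
matcher = cv2.BFMatcher(cv2.NORM_L2)
raw = matcher.knnMatch(des_r, des_t, k=2)

# A loose ratio test keeps many candidates: on weak-texture tissue the goal is
# to retain all plausible correspondences, even though many will be outliers
initial_pairs = [p[0] for p in raw if len(p) == 2 and p[0].distance < 0.9 * p[1].distance]
print(f"{len(initial_pairs)} initial matches (a large fraction may be false)")
```

On typical endoscope frames a large share of these candidates is wrong, which is exactly the situation the locality-preserving constraints of the invention are designed to handle.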
Existing endoscope image feature point matching methods mainly suffer from the following problems: under weak-texture conditions the number of extracted feature points is insufficient, so too few matches are finally retained and detailed surface or structure reconstruction is difficult; when the detection threshold is changed to extract enough feature points, the weak texture and the presence of deformation produce a large number of false matches, the many outliers cannot be effectively eliminated, and the matching precision drops.
For clinical medical image processing, obtaining a sufficient number of high-precision matching results is key to the success of image-guided minimally invasive surgery, so an endoscope image feature point matching method with high precision needs to be provided.
Disclosure of Invention
The embodiments of the invention provide an endoscope image feature point matching method and system to address problems of the prior art such as an insufficient number of extracted image feature points, low matching precision, and interference from a large number of noise points.
In a first aspect, an embodiment of the present invention provides an endoscope image feature point matching method, including:
S1, extracting image feature points from two endoscope images to be matched, completing image feature point matching based on the similarity of feature descriptors, and acquiring an initial matching point pair set of the image feature points;
S2, applying a local distance constraint to the initial matching point pair set based on the correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points, and on the local structures; extending the feature point information by combining affine parameters and the motion information of the corresponding feature points; and estimating a similarity boundary to obtain a feature point correspondence set with motion consistency;
S3, optimizing spatial distance perception based on the feature point correspondence set to generate a bilateral affine motion consistency model, setting an adaptive distance threshold parameter for the bilateral motion boundary, obtaining an inlier matching set corresponding to the global image, and realizing feature point matching between the two endoscope images to be matched.
Wherein the step S2 specifically includes:
S21, establishing a correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points;
S22, constructing the local structure of each image feature point from its 6 closest points in the corresponding point set;
S23, setting a threshold parameter based on the correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points, together with the fact that the distance between any initial matching point pair and its neighborhood points is fixed, and calculating the locality-preserving inlier set, thereby realizing the local distance constraint;
S24, extending the feature point information extracted from the two images to be matched by adding the motion information of the corresponding feature points, and applying a similarity boundary function to obtain the feature point correspondence set with motion consistency.
In step S22, the Euclidean distance is used to find, for each image feature point, its 6 closest points in the corresponding point set, and the local structure is constructed from these points.
In step S24, the similarity boundary function is applied on the basis of the locality-preserving inlier set to obtain the feature point correspondence set with motion consistency.
Wherein the step of S3 specifically includes:
s31, based on the feature point corresponding relation set, applying an affine motion boundary with bilateral change to obtain a bilateral affine motion consistency model;
s32, setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to obtain an interior point matching set corresponding to the global image, and realizing feature point matching between the two endoscope images to be matched.
Wherein, the setting of the adaptive spatial threshold parameter of the bilateral motion boundary in step S32 specifically includes:
and setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to be consistent with the distance constraint threshold parameter of the local structure in combination with a local structure maintenance model.
The step S32 of calculating and obtaining the inlier matching set corresponding to the global image specifically includes:
obtaining the inlier matching set corresponding to the global image by setting a distance threshold between the estimates of the image feature points in the two directions of the bilateral motion boundary and the noisy observations of the feature points.
In a second aspect, an embodiment of the present invention provides an endoscope image feature point matching system, including:
the first processing module is used for extracting image characteristic points based on two endoscope images to be matched, completing image characteristic point matching based on the similarity of the characteristic descriptors and acquiring an initial matching point pair set of the image characteristic points;
the second processing module is used for carrying out local distance constraint on the initial matching point pair set based on the unknown deformation between the two endoscope images to be matched, the corresponding relation between the local neighborhood structures of the image feature points and the local structures, expanding the feature point information by combining affine parameters and the motion information of the corresponding feature points, and estimating similarity boundaries to obtain a feature point corresponding relation set with motion consistency;
and the third processing module is used for optimizing the spatial distance perception based on the feature point corresponding relation set, generating a bilateral affine motion consistency model, setting self-adaptive distance threshold parameters of a bilateral motion boundary, obtaining an interior point matching set corresponding to a global image, and realizing feature point matching between two endoscope images to be matched.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
the endoscope image feature point matching method comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the steps of any one of the endoscope image feature point matching methods when executing the program.
In a fourth aspect, embodiments of the present invention provide a non-transitory computer-readable storage medium on which is stored a computer program that, when executed by a processor, implements the steps of any one of the endoscopic image feature point matching methods.
According to the endoscope image feature point matching method and system provided by the embodiments of the invention, reliable correspondences are found, through a locality-preserving motion consistency method, from a given set of initial matches containing a large number of outliers, and enough feature point matching pairs are retained while high precision is ensured, which benefits applications such as more detailed tissue surface reconstruction, more accurate camera motion estimation, and surgical navigation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of an endoscope image feature point matching method according to an embodiment of the present invention;
fig. 2 is a detailed flowchart of step S2 in the method for matching feature points of an endoscopic image according to an embodiment of the present invention;
fig. 3 is a detailed flowchart of step S3 in the method for matching feature points of an endoscopic image according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an endoscope image feature point matching system according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a second processing module of the endoscope image feature point matching system according to the embodiment of the present invention;
fig. 6 is a schematic structural diagram of a third processing module of the endoscope image feature point matching system according to the embodiment of the present invention;
FIG. 7 is a flow chart of a matching algorithm provided by an embodiment of the present invention;
FIG. 8 is a flow chart of a motion consistency constraint algorithm based on local structure preservation according to an embodiment of the present invention;
FIG. 9 is a flowchart of an adaptive threshold based bilateral motion boundary constraint algorithm according to an embodiment of the present invention;
fig. 10 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Existing endoscope image feature point matching methods have the following problems: under weak-texture conditions the number of extracted feature points is insufficient, so too few matches are finally retained and detailed surface or structure reconstruction is difficult; when the detection threshold is changed to extract enough feature points, the weak texture and the presence of deformation produce a large number of false matches, the many outliers cannot be effectively eliminated, and the matching precision drops.
To solve the problems in the prior art, an embodiment of the present invention provides an endoscope image feature point matching method. Fig. 1 is a flowchart of the endoscope image feature point matching method provided by an embodiment of the present invention; as shown in fig. 1, it includes:
S1, extracting image feature points from two endoscope images to be matched, completing image feature point matching based on the similarity of feature descriptors, and acquiring an initial matching point pair set of the image feature points;
S2, applying a local distance constraint to the initial matching point pair set based on the correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points, and on the local structures; extending the feature point information by combining affine parameters and the motion information of the corresponding feature points; and estimating a similarity boundary to obtain a feature point correspondence set with motion consistency;
S3, optimizing spatial distance perception based on the feature point correspondence set to generate a bilateral affine motion consistency model, setting an adaptive distance threshold parameter for the bilateral motion boundary, obtaining an inlier matching set corresponding to the global image, and realizing feature point matching between the two endoscope images to be matched.
Specifically, the feature points of the two endoscope images to be matched are first extracted in step S1. Assume that the given pair of endoscope images are I_r (the reference image) and I_t (the template image). An image feature extraction algorithm, in this embodiment the ASIFT algorithm, is used to extract the features of the two images, giving the corresponding feature point sets F_r and F_t. Because the tissue in endoscope images is smooth and its texture information is weak, all possible matching results need to be considered; the similarity of the feature descriptors is therefore used to obtain the initial feature point correspondences, defined as the set S_o of N initial matching point pairs (x_n, y_n) (the set definition is rendered as a formula image in the original). The goal is to eliminate the false matches in S_o and obtain reliable correspondences. In step S2, the correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points is first established; a local distance constraint is applied to the initial matching point pair set obtained in step S1 based on this correspondence and on the local structures; affine parameters and the motions of the corresponding feature points are then introduced to extend the feature point information; and the similarity boundary is finally estimated to enforce consistency of the feature point motions. In step S3, spatial distance perception is optimized on the basis of all the feature point sets generated in step S2 that satisfy the similarity boundary, giving the bilateral affine motion consistency model; meanwhile, the average distance of the local structure is used as the adaptive distance threshold of the bilateral motion boundary model for the decision computation, so that noisy outliers in the image are eliminated and the final feature point matching result is obtained.
The overall matching algorithm flow is shown in fig. 7, and fig. 7 is a matching algorithm flow chart provided by the embodiment of the present invention.
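The three steps can also be read as a small filtering pipeline over the initially matched coordinates. The Python skeleton below is purely illustrative: the function names are mine, not the patent's, and the three helpers it calls are sketched after the corresponding steps of the detailed description further below.

```python
import numpy as np

def match_endoscope_images(X, Y):
    """Illustrative S2 -> S3 filtering pipeline.

    X, Y: (N, 2) NumPy arrays of initially matched coordinates in I_r and I_t
    (the output of the descriptor-similarity matching of step S1).
    """
    keep = locality_preserving_inliers(X, Y, k=6, xi=4)   # S2: local distance constraint
    keep = motion_consistent_set(X, Y, keep)              # S2: similarity boundary check
    inliers = bilateral_motion_inliers(X, Y, keep)        # S3: bilateral motion boundary
    return X[inliers], Y[inliers]
```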
The embodiment of the invention finds the reliable corresponding relation from the given initial matching pair containing a large number of outliers by a motion consistency method based on locality preservation, and reserves enough feature point matching pairs while ensuring high precision, thereby being beneficial to realizing more detailed tissue surface reconstruction, more accurate camera motion estimation, surgical navigation and other applications.
On the basis of the foregoing embodiment, the specific steps of step S2 are shown in fig. 2. Fig. 2 is a detailed flowchart of step S2 in the endoscope image feature point matching method provided by an embodiment of the present invention; as shown in fig. 2, it includes:
S21, establishing a correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points;
S22, constructing the local structure of each image feature point from its 6 closest points in the corresponding point set;
S23, setting a threshold parameter based on the correspondence between the unknown deformation of the two endoscope images to be matched and the local neighborhood structures of the image feature points, together with the fact that the distance between any initial matching point pair and its neighborhood points is fixed, and calculating the locality-preserving inlier set, thereby realizing the local distance constraint;
S24, extending the feature point information extracted from the two images to be matched by adding the motion information of the corresponding feature points, and applying a similarity boundary function to obtain the feature point correspondence set with motion consistency.
In step S22, the Euclidean distance is used to find, for each image feature point, its 6 closest points in the corresponding point set, and the local structure is constructed from these points.
In step S24, the similarity boundary function is applied on the basis of the locality-preserving inlier set to obtain the feature point correspondence set with motion consistency.
Specifically, step S21 assumes that although unknown deformation occurs between the pair of endoscope images, the local neighborhood structure of corresponding feature points does not change freely; it is therefore sufficient to constrain only the local neighborhood structure and to find the correspondence between the unknown deformation of the two images to be matched and the local neighborhood structures of their feature points. The task of establishing this reliable correspondence is expressed as finding the optimal solution t of a locality loss function C. The loss is rendered as a formula image in the original; written out from the surrounding definitions it is
C(t; ξ) = Σ_{n=1..N} t_n · c_n + ξ · (N − Σ_{n=1..N} t_n),
where t is an N×1 binary vector, with t_n = 1 and t_n = 0 indicating that the corresponding point pair (x_n, y_n) is an inlier or an outlier respectively; the parameter ξ > 0 balances the weights of the first and second summation terms of the loss C; the parameter N is the number of initial matching point pairs; and the parameter c_n measures the local distance structure, computed as
c_n = Σ_{m: x_m ∈ N_{x_n}} d(x_n, x_m) + Σ_{m: y_m ∈ N_{y_n}} d(y_n, y_m),
where d is a binary distance and N_x and N_y denote the neighborhood point sets of the feature points x and y respectively.
In step S22, based on the Euclidean distance, the local structure is constructed by finding for each feature point its k closest points in the corresponding point set, with k = 6 here. The binary distance (its conditions are given as formula images in the original) is such that d(x_n, x_m) = 0 when the point y_m matched to x_m also lies in the neighborhood N_{y_n}, and d(x_n, x_m) = 1 otherwise; d(y_n, y_m) is defined in the same way.
In step S23, since the initial image feature point correspondences have been determined in step S21 and the distance between any initial matching point pair and its neighborhood points is fixed, the decision is made with the threshold parameter ξ: when c_n ≤ ξ, t_n = 1. Solving for the optimal solution t of the locality loss function C in this way yields the locality-preserving inlier set, which can be written as { (x_n, y_n) | c_n ≤ ξ, n = 1, …, N }.
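Steps S21 to S23 reduce to counting how well each match's neighbourhood is preserved on both sides. The sketch below is a minimal illustration under that reading, assuming the matched coordinates are N×2 NumPy arrays; the function name, the default value of ξ, and the use of a k-d tree are my own choices, not taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def locality_preserving_inliers(X, Y, k=6, xi=4):
    """Locality-preserving filter: X, Y are (N, 2) arrays of matched
    coordinates in the reference and template image respectively."""
    n = len(X)
    # k nearest neighbours of every point on each side (column 0 is the point itself)
    nbr_x = cKDTree(X).query(X, k=k + 1)[1][:, 1:]
    nbr_y = cKDTree(Y).query(Y, k=k + 1)[1][:, 1:]
    cost = np.empty(n)
    for i in range(n):
        shared = len(set(nbr_x[i]) & set(nbr_y[i]))
        # d = 0 when a neighbour's matched point is also a neighbour on the
        # other side; each non-shared neighbour contributes 1 to c_n on each side
        cost[i] = 2 * (k - shared)
    return np.flatnonzero(cost <= xi)   # indices with c_n <= xi, i.e. t_n = 1
```

With ξ of the same order as k, the filter keeps pairs whose neighbourhoods largely agree across the two images.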
Next, in step S24, the description of each image feature point is extended from position information alone, x_n = [x_n, y_n], to x_n = [x_n, y_n; v_n; o_n], where n is the index of the point, v_n = [u_n, v_n] represents the motion of the corresponding feature point in the X and Y directions, and o_n = [scale, orientation, tilt, rotation] is the affine information of the feature descriptor in the ASIFT algorithm. On the basis of the locality-preserving set, a similarity boundary function is applied to achieve motion consistency. The resulting objective (rendered as formula images in the original) involves the Huber function H(·), which penalizes estimates that deviate from the assumed observation "1"; a matrix G(i, j) whose elements are Gaussian kernel values, where γ is the standard deviation and λ is the weight of the smoothing term; and an M-dimensional Gaussian kernel weight vector representing the distribution of M feature cluster centers over the feature points {x_j}. Minimizing this objective estimates the optimal parameters, which are substituted back into the expression to compute the similarity boundary. By verifying whether each initial matching result in S_o satisfies the threshold condition ε_likelihood under the similarity boundary, the correspondence set S_lb with motion consistency is obtained.
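The similarity-boundary step thus amounts to regressing a smooth motion field over the locality-preserving matches and keeping the initial matches that agree with it. The sketch below is one plausible realization under that reading, using a Gaussian-kernel ridge regression of the motion vectors and a simple residual check in place of the Huber-penalised boundary; the kernel width gamma, the smoothing weight lam, and the threshold eps_likelihood are illustrative values in pixel units, and the patent's exact objective is only available as formula images.

```python
import numpy as np

def motion_consistent_set(X, Y, keep_idx, gamma=60.0, lam=0.5, eps_likelihood=5.0):
    """Fit a smooth motion field on the locality-preserving matches and keep
    every initial match whose observed motion stays close to it (sketch)."""
    Xk = X[keep_idx]
    Vk = (Y - X)[keep_idx]                              # observed motions v_n = y_n - x_n
    # Gaussian kernel matrix G(i, j) over the kept feature point positions
    d2 = ((Xk[:, None, :] - Xk[None, :, :]) ** 2).sum(axis=-1)
    G = np.exp(-d2 / (2.0 * gamma ** 2))
    # Kernel ridge regression of the motion field (smoothness weight lam)
    W = np.linalg.solve(G + lam * np.eye(len(Xk)), Vk)
    # Evaluate the smooth motion field at every initial match in S_o
    d2_all = ((X[:, None, :] - Xk[None, :, :]) ** 2).sum(axis=-1)
    V_hat = np.exp(-d2_all / (2.0 * gamma ** 2)) @ W
    # Keep the matches whose residual motion lies within the similarity boundary
    residual = np.linalg.norm((Y - X) - V_hat, axis=1)
    return np.flatnonzero(residual <= eps_likelihood)
```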
Referring to fig. 8, a flowchart of the complete algorithm from step S21 to step S24 in the foregoing embodiment is shown; fig. 8 is a flowchart of the motion consistency constraint algorithm based on local structure preservation provided by an embodiment of the present invention.
In the embodiment of the invention, a local distance constraint is applied to the acquired initial matching point pair set, realizing a motion consistency constraint based on local structure preservation. False matches can thus be accurately eliminated from endoscope image feature point correspondences containing a large amount of noise; both the invariance of the local structure under deformation and the consistency of the global motion are taken into account, so the matching problem under different types of deformation can be solved robustly.
On the basis of the foregoing embodiment, referring to fig. 3 for specific steps of step S3, fig. 3 is a specific flowchart of step S3 in the endoscope image feature point matching method according to the embodiment of the present invention, as shown in fig. 3, including:
s31, based on the feature point corresponding relation set, applying an affine motion boundary with bilateral change to obtain a bilateral affine motion consistency model;
s32, setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to obtain an interior point matching set corresponding to the global image, and realizing feature point matching between the two endoscope images to be matched.
Wherein, the setting of the adaptive spatial threshold parameter of the bilateral motion boundary in step S32 specifically includes:
and setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to be consistent with the distance constraint threshold parameter of the local structure in combination with a local structure maintenance model.
The step S32 of calculating and obtaining the inlier matching set corresponding to the global image specifically includes:
obtaining the inlier matching set corresponding to the global image by setting a distance threshold between the estimates of the image feature points in the two directions of the bilateral motion boundary and the noisy observations of the feature points. Specifically, in order to give the model a fine spatial perception capability, step S31 applies a bilateral affine motion boundary on top of the correspondence set S_lb with motion consistency obtained in step S24, so as to obtain a more accurate global model. The motion boundaries in the X and Y directions are estimated by solving an optimization problem (the objective and the boundary expressions are rendered as formula images in the original), in which O_k is a scalar offset value and k is the index of the different smoothing functions. The optimal solution is obtained by minimizing the value of the loss function, and each smoothing function is evaluated by substituting the optimal solution into its expression, which yields the boundary estimates in both directions at the same time.
In step S32, in order for the local distance constraint to act throughout the whole matching process, the spatial distance threshold of the bilateral motion boundary must be kept consistent with the local distance constraint. Therefore, in combination with the local structure preservation model, an adaptive spatial threshold d_lp is set (its expression is given as a formula image in the original) so that the global motion better incorporates the locality-preserving strategy. By setting the distance threshold d_lp between the estimation results in the two directions and the noisy observation data, the high-precision inlier set S_mc is obtained.
Finally, the matching of the feature points between the two endoscope images to be matched is realized.
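Since the exact bilateral regression is only available as formula images, the sketch below illustrates one plausible reading: a simplified affine motion model is fitted separately for the X and Y components of the motion on the motion-consistent set, the adaptive threshold d_lp is taken as the average distance to the k nearest neighbours within the local structure, and every initial match whose residual in both directions stays within d_lp is accepted. All names and the threshold rule are illustrative assumptions, not the patent's exact formulation.

```python
import numpy as np
from scipy.spatial import cKDTree

def bilateral_motion_inliers(X, Y, consistent_idx, k=6):
    """Global bilateral (X- and Y-direction) motion boundary with an
    adaptive threshold tied to the local structure (illustrative sketch)."""
    Xc, Vc = X[consistent_idx], (Y - X)[consistent_idx]
    # Simplified affine motion model per direction: v ~ A @ [x, y, 1]
    A_design = np.hstack([Xc, np.ones((len(Xc), 1))])
    coeff, *_ = np.linalg.lstsq(A_design, Vc, rcond=None)   # shape (3, 2)
    # Adaptive threshold d_lp: mean distance to the k nearest neighbours
    # inside the locality-preserving structure
    dists, _ = cKDTree(Xc).query(Xc, k=k + 1)
    d_lp = dists[:, 1:].mean()
    # Check every initial match against the fitted boundary in both directions
    pred = np.hstack([X, np.ones((len(X), 1))]) @ coeff
    resid = np.abs((Y - X) - pred)                           # per-direction residuals
    return np.flatnonzero((resid[:, 0] <= d_lp) & (resid[:, 1] <= d_lp))
```

Tying the acceptance threshold to the neighbourhood distances, rather than to a fixed pixel value, is what lets the same rule serve images with different and unknown deformation magnitudes.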
Referring to fig. 9, a flowchart of the complete algorithm from step S31 to step S32 in the foregoing embodiment is shown; fig. 9 is a flowchart of the adaptive-threshold-based bilateral motion boundary constraint algorithm provided by an embodiment of the present invention.
Through the use of the adaptive threshold, the embodiment of the invention adapts its constraint to deformations of different scales, is effective for endoscope images whose deformation magnitude is unknown, retains enough correct feature point correspondences while ensuring high precision, and is convenient for subsequent operations such as structure reconstruction.
Fig. 4 is a schematic structural diagram of an endoscope image feature point matching system according to an embodiment of the present invention, as shown in fig. 4, including: a first processing module 41, a second processing module 42 and a third processing module 43, wherein: the first processing module 41 is configured to extract image feature points based on two endoscope images to be matched, complete image feature point matching based on feature descriptor similarity, and obtain an initial matching point pair set of the image feature points; the second processing module 42 is configured to perform local distance constraint on the initial matching point pair set based on the unknown deformation between the two endoscope images to be matched, the corresponding relationship between the local neighborhood structures of the image feature points, and the local structures, expand the feature point information by combining the affine parameters and the motion information of the corresponding feature points, and estimate a similarity boundary to obtain a feature point corresponding relationship set with motion consistency; the third processing module 43 is configured to optimize the spatial distance perception based on the feature point correspondence set, generate a bilateral affine motion consistency model, set an adaptive distance threshold parameter of a bilateral motion boundary, obtain an interior point matching set corresponding to a global image, and implement feature point matching between the two endoscope images to be matched.
The system provided by the embodiment of the present invention is used for executing the corresponding method, the specific implementation manner of the system is consistent with the implementation manner of the method, and the related algorithm flow is the same as the algorithm flow of the corresponding method, and is not described herein again.
In the embodiment of the invention, a local distance constraint is applied to the acquired initial matching point pair set, realizing a motion consistency constraint based on local structure preservation. False matches can thus be accurately eliminated from endoscope image feature point correspondences containing a large amount of noise; both the invariance of the local structure under deformation and the consistency of the global motion are taken into account, so the matching problem under different types of deformation can be solved robustly.
On the basis of the foregoing embodiment, fig. 5 is a schematic diagram of a sub-structure of a second processing module of the endoscope image feature point matching system according to an embodiment of the present invention, and as shown in fig. 5, the second processing module 42 specifically includes: a matching submodule 421, a building submodule 422, a first computation submodule 423 and a second computation submodule 424, wherein:
the matching submodule 421 is configured to establish a corresponding relationship between the unknown deformation between the two endoscope images to be matched and the local neighborhood structure of the image feature point; a construction sub-module 422 is configured to construct the local structure based on each of the image feature points at a respective 6 closest points in the corresponding set of points; the first calculation submodule 423 is configured to set a threshold parameter based on the correspondence between the unknown deformation between the two endoscope images to be matched and the local neighborhood structure of the image feature point, and the distance between any initial matching point pair and its neighborhood point is fixed, calculate and obtain the inner point set with locality preservation, and implement local distance constraint; the second calculating submodule 424 is configured to expand the feature point information extracted from the two images to be matched, add motion information corresponding to feature points, and apply a similarity boundary function to obtain a feature point correspondence set having the motion consistency.
The system provided by the embodiment of the present invention is used for executing the corresponding method, the specific implementation manner of the system is consistent with the implementation manner of the method, and the related algorithm flow is the same as the algorithm flow of the corresponding method, which is not described herein again.
In the embodiment of the invention, a local distance constraint is applied to the acquired initial matching point pair set, realizing a motion consistency constraint based on local structure preservation. False matches can thus be accurately eliminated from endoscope image feature point correspondences containing a large amount of noise; both the invariance of the local structure under deformation and the consistency of the global motion are taken into account, so the matching problem under different types of deformation can be solved robustly.
On the basis of the foregoing embodiment, fig. 6 is a schematic diagram of a sub-structure of a third processing module of the endoscope image feature point matching system according to the embodiment of the present invention, and as shown in fig. 6, the third processing module 43 specifically includes: a third computation submodule 431 and a fourth computation submodule 432, wherein:
the third computation submodule 431 is configured to apply an affine motion boundary with bilateral change to obtain a bilateral affine motion consistency model based on the feature point correspondence set; the fourth computation submodule 432 is configured to set an adaptive spatial threshold parameter of the bilateral motion boundary, obtain an interior point matching set corresponding to the global image, and implement feature point matching between the two endoscopic images to be matched.
The system provided by the embodiment of the present invention is used for executing the corresponding method, the specific implementation manner of the system is consistent with the implementation manner of the method, and the related algorithm flow is the same as the algorithm flow of the corresponding method, which is not described herein again.
Through the use of the adaptive threshold, the embodiment of the invention adapts its constraint to deformations of different scales, is effective for endoscope images whose deformation magnitude is unknown, retains enough correct feature point correspondences while ensuring high precision, and is convenient for subsequent operations such as structure reconstruction.
Fig. 10 illustrates a physical structure diagram of a server, and as shown in fig. 10, the server may include: a processor (processor)1010, a communication Interface (Communications Interface)1020, a memory (memory)1030, and a communication bus 1040, wherein the processor 1010, the communication Interface 1020, and the memory 1030 communicate with each other via the communication bus 1040. Processor 1010 may call logic instructions in memory 1030 to perform the following method: extracting image characteristic points based on two endoscope images to be matched, completing image characteristic point matching based on the similarity of the characteristic descriptors, and acquiring an initial matching point pair set of the image characteristic points; local distance constraint is carried out on the initial matching point pair set on the basis of unknown deformation between the two endoscope images to be matched, the corresponding relation between the local neighborhood structures of the image feature points and the local structures, affine parameters and corresponding feature point motion information are combined to expand feature point information, similarity boundaries are estimated, and a feature point corresponding relation set with motion consistency is obtained; and optimizing the spatial distance perception based on the feature point corresponding relation set, generating a bilateral affine motion consistency model, setting an adaptive distance threshold parameter of a bilateral motion boundary, obtaining an interior point matching set corresponding to a global image, and realizing feature point matching between the two endoscope images to be matched.
Furthermore, the logic instructions in the memory 1030 can be implemented in software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, an embodiment of the present invention further provides a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program is implemented by a processor to perform the method provided by the foregoing embodiments, for example, including: extracting image characteristic points based on two endoscope images to be matched, completing image characteristic point matching based on the similarity of the characteristic descriptors, and acquiring an initial matching point pair set of the image characteristic points; local distance constraint is carried out on the initial matching point pair set on the basis of unknown deformation between the two endoscope images to be matched, the corresponding relation between the local neighborhood structures of the image feature points and the local structures, affine parameters and corresponding feature point motion information are combined to expand feature point information, similarity boundaries are estimated, and a feature point corresponding relation set with motion consistency is obtained; and optimizing the spatial distance perception based on the feature point corresponding relation set, generating a bilateral affine motion consistency model, setting an adaptive distance threshold parameter of a bilateral motion boundary, obtaining an interior point matching set corresponding to a global image, and realizing feature point matching between the two endoscope images to be matched.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An endoscope image feature point matching method, comprising:
s1, extracting image feature points based on two endoscope images to be matched, completing image feature point matching based on the similarity of feature descriptors, and acquiring an initial matching point pair set of the image feature points;
s2, based on the unknown deformation between the two endoscope images to be matched and the corresponding relation and local structure between the local neighborhood structures of the image feature points, performing local distance constraint on the initial matching point pair set, expanding the feature point information by combining affine parameters and corresponding feature point motion information, and estimating similarity boundaries to obtain a feature point corresponding relation set with motion consistency;
s3, optimizing the spatial distance perception based on the feature point corresponding relation set, generating a bilateral affine motion consistency model, setting adaptive distance threshold parameters of bilateral motion boundaries, obtaining an interior point matching set corresponding to a global image, and realizing feature point matching between the two endoscope images to be matched.
2. The endoscopic image feature point matching method according to claim 1, wherein said step of S2 specifically includes:
s21, establishing a corresponding relation between the unknown deformation between the two endoscope images to be matched and the local neighborhood structure of the image feature point;
s22, constructing the local structure based on each of the image feature points at the respective 6 closest points in the corresponding point set;
s23, setting a threshold parameter based on the corresponding relation between the unknown deformation between the two endoscope images to be matched and the local neighborhood structure of the image feature point, and the distance between any initial matching point pair and the neighborhood point thereof is fixed, calculating and obtaining an inner point set with locality preservation, and realizing local distance constraint;
s24, expanding the feature point information extracted from the two images to be matched, adding the motion information corresponding to the feature points, and applying a similarity boundary function to obtain a feature point corresponding relation set with the motion consistency.
3. An endoscope image feature point matching method according to claim 2, characterized in that said step S22 includes performing calculation based on euclidean distance formula to obtain said local structure constructed based on each of said image feature points and its corresponding 6 closest points in said corresponding point set.
4. The method for matching feature points of an endoscope image according to claim 3, wherein said feature point correspondence set having motion consistency obtained by applying a similarity boundary function in step S24 is obtained based on said interior point set having locality preservation.
5. The endoscope image feature point matching method according to claim 2, wherein the step of S3 specifically includes:
s31, based on the feature point corresponding relation set, applying an affine motion boundary with bilateral change to obtain a bilateral affine motion consistency model;
s32, setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to obtain an interior point matching set corresponding to the global image, and realizing feature point matching between the two endoscope images to be matched.
6. The endoscopic image feature point matching method according to claim 5,
the step S32 of setting the adaptive spatial threshold parameter of the bilateral motion boundary specifically includes:
and setting the self-adaptive spatial threshold parameter of the bilateral motion boundary to be consistent with the distance constraint threshold parameter of the local structure in combination with a local structure maintenance model.
7. The endoscopic image feature point matching method according to claim 6,
the step S32 of calculating and obtaining the interior point matching set corresponding to the global image specifically includes:
and obtaining an interior point matching set corresponding to the global image by setting estimation results of the image feature points in two directions of a bilateral motion boundary and a distance threshold value between the image feature point noise observation data.
8. An endoscopic image feature point matching system, comprising:
the first processing module is used for extracting image characteristic points based on two endoscope images to be matched, completing image characteristic point matching based on the similarity of the characteristic descriptors and acquiring an initial matching point pair set of the image characteristic points;
the second processing module is used for carrying out local distance constraint on the initial matching point pair set based on the unknown deformation between the two endoscope images to be matched, the corresponding relation between the local neighborhood structures of the image feature points and the local structures, expanding the feature point information by combining affine parameters and the motion information of the corresponding feature points, and estimating similarity boundaries to obtain a feature point corresponding relation set with motion consistency;
and the third processing module is used for optimizing the spatial distance perception based on the feature point corresponding relation set, generating a bilateral affine motion consistency model, setting adaptive distance threshold parameters of a bilateral motion boundary, obtaining an interior point matching set corresponding to a global image, and realizing feature point matching between the two endoscope images to be matched.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of an endoscopic image feature point matching method as claimed in any one of claims 1 to 7 are implemented when the processor executes the program.
10. A non-transitory computer-readable storage medium having stored thereon a computer program, which when executed by a processor, implements the steps of an endoscopic image feature point matching method as claimed in any one of claims 1 to 7.
CN201910521138.9A 2019-06-17 2019-06-17 Endoscope image feature point matching method and system Active CN110245671B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910521138.9A CN110245671B (en) 2019-06-17 2019-06-17 Endoscope image feature point matching method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910521138.9A CN110245671B (en) 2019-06-17 2019-06-17 Endoscope image feature point matching method and system

Publications (2)

Publication Number Publication Date
CN110245671A CN110245671A (en) 2019-09-17
CN110245671B true CN110245671B (en) 2021-05-28

Family

ID=67887435

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910521138.9A Active CN110245671B (en) 2019-06-17 2019-06-17 Endoscope image feature point matching method and system

Country Status (1)

Country Link
CN (1) CN110245671B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111639655B (en) * 2020-05-20 2023-10-13 北京百度网讯科技有限公司 Image local information generation method, device, electronic equipment and storage medium
CN112001432B (en) * 2020-08-12 2022-07-08 福建农林大学 Image matching method based on robust feature matching of advanced neighborhood topology consistency
CN112784898B (en) * 2021-01-21 2024-01-30 大连外国语大学 Feature point matching method based on local relative motion consistency clustering
CN113689555B (en) * 2021-09-09 2023-08-22 武汉惟景三维科技有限公司 Binocular image feature matching method and system
CN116385480B (en) * 2023-02-03 2023-10-20 腾晖科技建筑智能(深圳)有限公司 Detection method and system for moving object below tower crane

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101161201A (en) * 2007-11-08 2008-04-16 珠海友通科技有限公司 Method for registrating external circumstance DSA elasticity automatically
CN109008909A (en) * 2018-07-13 2018-12-18 宜宾学院 A kind of low-power consumption capsule endoscope Image Acquisition and three-dimensional reconstruction system
CN109697692A (en) * 2018-12-29 2019-04-30 安徽大学 One kind being based on the similar feature matching method of partial structurtes

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9411824B2 (en) * 2012-10-26 2016-08-09 Lida Huang Method and apparatus for image retrieval

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101161201A (en) * 2007-11-08 2008-04-16 珠海友通科技有限公司 Method for registrating external circumstance DSA elasticity automatically
CN109008909A (en) * 2018-07-13 2018-12-18 宜宾学院 A kind of low-power consumption capsule endoscope Image Acquisition and three-dimensional reconstruction system
CN109697692A (en) * 2018-12-29 2019-04-30 安徽大学 One kind being based on the similar feature matching method of partial structurtes

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Feature-point-based registration method for endoscopic images and CT images; Guo Xiaojun et al.; Modern Business Trade Industry; 2013-10-01; Vol. 25, No. 19; pp. 194-195 *

Also Published As

Publication number Publication date
CN110245671A (en) 2019-09-17

Similar Documents

Publication Publication Date Title
CN110245671B (en) Endoscope image feature point matching method and system
JP2021530061A (en) Image processing methods and their devices, electronic devices and computer-readable storage media
US20210224980A1 (en) 3D/2D Vascular Registration Method and Its Means
CN110264509A (en) Determine the method, apparatus and its storage medium of the pose of image-capturing apparatus
CN111696164B (en) Self-adaptive window width and window level adjusting method, device, computer system and storage medium
WO2016195698A1 (en) Method and system for simultaneous scene parsing and model fusion for endoscopic and laparoscopic navigation
US10657625B2 (en) Image processing device, an image processing method, and computer-readable recording medium
US20140168204A1 (en) Model based video projection
JP7417772B2 (en) Three-dimensional posture adjustment method, device, electronic device, and storage medium
CN110009663B (en) Target tracking method, device, equipment and computer readable storage medium
CN111868738B (en) Cross-device monitoring computer vision system
CN112634256A (en) Circle detection and fitting method and device, electronic equipment and storage medium
CN112651389A (en) Method and device for training, correcting and identifying correction model of non-orthoptic iris image
CN110288637A (en) Multi-angle DSA contrastographic picture blood vessel matching process and device
Chu et al. Endoscopic image feature matching via motion consensus and global bilateral regression
JP2009294955A (en) Image processor, image processing method, image processing program and recording medium with the same program recorded thereon
US20210082192A1 (en) Topology-change-aware volumetric fusion for real-time dynamic 4d reconstruction
Ruiz et al. Weighted regularized statistical shape space projection for breast 3D model reconstruction
CN113592971B (en) Virtual human body image generation method, system, equipment and medium
JP7452698B2 (en) Reinforcement learning model for labeling spatial relationships between images
JP7166738B2 (en) Information processing device, information processing method, and program
CN113962957A (en) Medical image processing method, bone image processing method, device and equipment
CN112991152A (en) Image processing method and device, electronic equipment and storage medium
JP2022509564A (en) Neural network active training system and image processing system
Zhang et al. Robust feature matching for VSLAM in non-rigid scenes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant