CN112288009A - R-SIFT chip hardware Trojan horse image registration method based on template matching

R-SIFT chip hardware Trojan horse image registration method based on template matching

Info

Publication number
CN112288009A
Authority
CN
China
Prior art keywords: matching, image, points, point, template
Legal status: Pending
Application number
CN202011183145.1A
Other languages
Chinese (zh)
Inventor
李玲玲
郭晓惠
孙宸
马晶晶
焦李成
刘芳
梁普江
刘旭
张梦漩
张丹
Current Assignee
Xidian University
Original Assignee
Xidian University
Application filed by Xidian University (2020-10-29)
Priority to CN202011183145.1A
Publication of CN112288009A (2021-01-29)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching


Abstract

The invention discloses a template-matching-based R-SIFT registration method for chip hardware-Trojan images. The method constructs a multi-scale pyramid; performs correlation matching on the multi-scale pyramid to obtain a coarse matching result image; applies Gaussian filtering and downsampling to the coarse matching result image to construct a multi-scale space; detects extreme points of the image in the different scale spaces; assigns each extreme point an orientation to obtain its position, scale and orientation information and to generate a feature descriptor; judges whether key points in the two images are similar using the ratio of Euclidean distances to the nearest-neighbor and next-nearest-neighbor feature descriptors, preliminarily generating pre-matching point pairs expressed as a matching matrix; and, once the pre-matching matrix is determined, corrects it with the RANSAC algorithm, removing redundant matching points to complete image matching. The method avoids gradient anomalies caused by outliers such as noise and improves the stability and accuracy of the image registration method.

Description

R-SIFT chip hardware Trojan horse image registration method based on template matching
Technical Field
The invention belongs to the technical field of image recognition, and particularly relates to a template-matching-based registration method for R-SIFT chip hardware-Trojan images.
Background
With the rapid development of integrated circuits (ICs), outsourcing chip production has become an increasingly common strategy for saving cost and shortening the production cycle. While outsourcing allows resources and capital to be allocated efficiently, it exposes integrated circuits to serious security risks. Malicious, minor modifications made to an integrated circuit chip by an untrusted third party during outsourced production are known as hardware Trojans. A hardware Trojan can carry out an attack on its own, for example by leaking information to an attacker, altering circuit function or even directly damaging the circuit, and it can cooperate with software to compromise system function. It can be implanted by modifying hardware such as application-specific integrated circuits (ASICs), microprocessors, microcontrollers, network processors and digital signal processors (DSPs), or by modifying firmware such as FPGA bitstreams.
Generally, software security is well understood and corresponding techniques exist to handle software-related security problems. However, as chip integration grows ever denser, a typical chip integrates millions to billions of transistors, and a hardware Trojan can be implanted and hidden for a long time in the barely protected underlying hardware of the chip, so the Trojan circuit cannot be located and analyzed precisely the way software source code can. Compared with traditional software Trojans, hardware Trojans have five characteristics. First, strong concealment: the Trojan logic is small and does not affect system function until activated, making it hard to detect. Second, flexible design and a complex activation mechanism: it can implement different functions such as function destruction, logic destruction and information leakage. Third, great destructive power: general-purpose CPUs are widely used for the central control of information systems, and once attacked by a hardware Trojan they can pose a major threat to national interests. Fourth, high design requirements: designing a hardware Trojan requires the designer to be familiar with the working principle of the target system and to have a high level of hardware design skill. Fifth, difficult protection and detection: as semiconductor processes advance and integration density rises, hardware Trojans become ever harder to detect.
The emergence of hardware Trojans has had a major impact on the security of the integrated-circuit industry and has attracted wide attention. If hardware Trojans maliciously implanted in circuits cannot be detected, insecure integrated circuit chips will flow into the market and the security of modern information systems cannot be guaranteed; when a problem occurs, a system may be paralyzed or remotely controlled. An efficient, high-precision, intelligent chip-detection method is therefore urgently needed to improve detection efficiency and accuracy, reduce missed detections and errors, lower the cost of layout-image comparison and detection, and enable rapid screening of large numbers of suspicious chips for hardware Trojans.
Current detection techniques fall into two categories, destructive and non-destructive, according to whether the chip under test is damaged. Non-destructive detection mainly comprises side-channel analysis and functional testing, in which the detection system looks for potential hardware Trojans through electrical signals. In practice, electrical-signal-based hardware Trojan detection faces two problems: first, hardware Trojans are highly concealed and rarely activated, making them hard to detect in practice; second, it is difficult for the detection system to achieve full coverage of the chip. To address these problems, researchers have used advanced optical imaging systems to build reverse-engineering-based hardware Trojan detection systems. This is a destructive method: several samples are drawn at random from the chips under test, delayered and separated, and chip images are then collected for comparison. Destructive detection is highly effective at finding hardware Trojans, but it destroys the inspected chips irreversibly, the reverse analysis and verification workload is large, and the cost in money and time is excessive.
In recent years, image matching based on feature matching has been applied more and more widely in practice: because features are extracted first and matching is performed on the features rather than directly on the pixels of the original images, the computational load is greatly reduced. In feature extraction, however, the detection of corner and edge points and the handling of noise interference remain problematic, and because of scale and affine changes between images, the quality of feature extraction has become the main factor limiting current image matching methods. How to effectively extract key feature points of an image under different scales, angles and illumination while suppressing noise interference is therefore the key to improving the performance of image matching algorithms.
Disclosure of Invention
The invention aims to overcome the shortcomings of the prior art by providing a template-matching-based registration method for R-SIFT chip hardware-Trojan images.
The invention adopts the following technical scheme:
A registration method of an R-SIFT chip hardware Trojan image based on template matching comprises the following steps:
S1, constructing a multi-scale pyramid;
S2, performing correlation matching on the multi-scale pyramid constructed in step S1 to obtain a coarse matching result image;
S3, performing Gaussian filtering and downsampling on the coarse matching result image obtained in step S2 to construct a multi-scale space;
S4, detecting extreme points of the image in the different scale spaces constructed in step S3;
S5, assigning an orientation to each extreme point obtained in step S4 to obtain its position, scale and orientation information and to generate a feature descriptor;
S6, judging whether key points in the two images are similar using the ratio of Euclidean distances to the nearest-neighbor and next-nearest-neighbor feature descriptors, preliminarily generating pre-matching point pairs and expressing them as a matching matrix;
S7, after the pre-matching matrix is determined, correcting it with the RANSAC algorithm and removing redundant matching points to complete image matching.
Specifically, step S1 specifically includes:
S101, converting an input image I to be detected into a grayscale image and processing it to obtain the grayed image I_g, whose pixel value at point (x, y) is I_g(x, y);
S102, dividing the image to be detected into image blocks, taking the top-left pixel of each block to form a new image, and layering the new image repeatedly, L times in total, to obtain L groups of images with resolution decreasing from high to low, thereby realizing multi-resolution image registration.
Specifically, step S2 specifically includes:
S201, reading in a template image and performing correlation matching over the whole searched image: the template T (M×N pixels) is translated over the searched image I_g (W×H pixels) and matched point by point; the area of the searched image covered by the template is the subgraph S_ij, where (i, j) is the coordinate of the upper-left corner of the subgraph on the searched image; the search range is 1 ≤ i ≤ W−M, 1 ≤ j ≤ H−N;
S202, comparing the similarity of the template T and the subgraph S_ij;
S203, rotating the template image four times, 90 degrees each time, performing the template matching operation between each rotated template and the searched image, and taking the rotation with the highest matching degree as the successful localization result; returning the corresponding position and size of the template on the image to be detected, and cropping a subgraph around that position to obtain the coarse matching result I'.
Specifically, step S3 specifically includes:
S301, convolving the pixels of the coarse matching result image I' one by one with a Gaussian kernel function G(x, y, σ) to obtain the representation L(x, y, σ) of the pixels at scale σ; applying Gaussian filtering layer by layer, the resulting images of different scales form a Gaussian pyramid;
S302, subtracting images of adjacent scales of the Gaussian pyramid to obtain the corresponding difference-of-Gaussian image D(x, y, σ).
Specifically, step S4 specifically includes:
S401, detecting extreme points of the image in the difference-of-Gaussian pyramid space: taking the current point as center, its 8 neighbors in the same layer, the 9 points formed by the corresponding point and its neighbors in the layer above, and the 9 points formed by the corresponding point and its neighbors in the layer below, 26 points in total, serve as the neighborhood against which the current point is tested for being an extremum; the extreme point is then localized precisely by interpolation;
S402, substituting the values of the remaining extreme points into the Taylor expansion;
S403, differentiating the Taylor expansion to obtain the gradient, computing gradient values with the ratio of exponentially weighted averages, and removing feature points whose gradient value is below a threshold.
Specifically, step S5 specifically includes:
S501, with the feature point as center, rotating the image gradient positions and directions in a square neighborhood around the feature point by an angle θ, i.e., rotating the x-axis of the original image to coincide with the main orientation; after rotation, re-selecting the square neighborhood and dividing it at equal intervals into 4 × 4 sub-regions;
S502, computing gradient orientation histograms with 8 directions in each sub-region, covering 0 to 360 degrees at 45-degree intervals; the feature descriptor of one feature point thus contains 4 × 4 × 8 = 128 values.
Specifically, step S6 specifically includes:
S601, matching the feature vectors of the two images by distance, using the ratio of the Euclidean distance to the nearest neighbor to the Euclidean distance to the next-nearest neighbor as the distance d_ratio and keeping the points whose d_ratio is smaller than the threshold;
S602, sorting the obtained points by distance to obtain the matching matrix C, the M × (n + 1) matrix whose j-th row is (L_1j, L_2j^(1), L_2j^(2), ..., L_2j^(n)), where the first-column elements are feature points from the first image and the remaining columns are feature points from the second image; M denotes the number of matching point pairs obtained in the pre-matching stage and n the number of candidate matching points retained after pre-matching; L_2j^(k) (j = 1, 2, ..., M; k = 1, 2, ..., n) is a pre-matching point in the second image, and each row is sorted in order of distance.
Specifically, step S7 specifically includes:
S701, selecting the sub-matrix C_1 of size (p·M) × 2 at the upper-left corner of the matrix C, where p is a hyperparameter;
S702, randomly selecting three matching pairs in the sub-matrix C_1 and computing the affine transformation parameters T_θ; then computing, from these affine parameters, the root-mean-square error between the predicted and actual matching points of the remaining pairs, keeping the matching pairs whose root-mean-square error is smaller than RMSE_th1, and recording the number of matching pairs retained;
S703, repeating step S702; after Iter iterations, selecting the group with the largest number of retained matching pairs and computing the affine transformation parameters T̂_θ under that group of matching pairs;
S704, for each feature point L_1j (j = 1, 2, ..., M) in the first image, using the affine transformation parameters T̂_θ to select, among its n candidate matching points in the matching matrix, the candidate whose root-mean-square error is smallest and below RMSE_th1 as the final matching point; finally updating and computing the affine transformation parameters of the final matching result and the overall root-mean-square error RMSE_1.
Compared with the prior art, the invention has at least the following beneficial effects:
the invention relates to a registration method of an R-SIFT chip hardware Trojan horse image, which realizes multi-scale and multi-resolution image registration. Firstly, coarse positioning is carried out by utilizing template matching, and R-SIFT feature matching regions are concentrated near coarse matching results. And the R-SIFT algorithm is adopted to respectively extract the local characteristics of the design layout micro-image and the suspicious chip micro-image, and the stability of a certain degree can be kept when the rotation brightness change is kept unchanged. The R-SIFT introduces an exponential weighted average ratio (Roewa), so that the interference of noise points can be further reduced, and the stability of the algorithm is improved; the method can be accurately matched with corresponding targets in the two images.
Furthermore, a multi-scale pyramid of the image to be matched is constructed and the original image is decomposed over multiple scales, so that similarity detection between the original image and the template image can be performed at different scales. This realizes multi-scale template matching, reduces the influence of large scale differences between the two images on the detection result, and greatly improves the accuracy of coarse matching.
Furthermore, a Gaussian pyramid is constructed, building images of different scales without introducing additional noise; feature points are detected at different scales, capturing both the detail and the contour information of the image, which improves detection accuracy and further improves the stability of image registration.
Furthermore, using the coarse result of the correlation matching, the R-SIFT feature matching region is concentrated near the suspicious chip image within the larger design layout, and R-SIFT feature matching is carried out on smaller subproblems; this greatly improves parallelism and processing efficiency, extracts more effective features, reduces time overhead and improves matching accuracy.
Furthermore, using extreme points as feature points selects the points in the image whose variation is most pronounced and whose features are most distinctive, while the influence of low-contrast points and edges is reduced; for a given number of feature points, this makes them as representative and stable as possible.
Further, for each feature point, gradient values are computed in different directions so that the feature point acquires orientation invariance. Feature descriptors containing scale, position and orientation information are generated, which preserves the invariance of image features under certain affine transformations and improves the registration capability of the method.
Furthermore, the ratio of Euclidean distances to the nearest-neighbor and next-nearest-neighbor feature descriptors serves as the similarity measure between two feature points, which reduces mismatches caused by occlusion and the like; the number of matching points in the pre-matching stage is controlled by a hyperparameter, allowing a more flexible balance between the number of matching points and matching precision.
Further, mismatched feature points are eliminated by introducing the RANSAC algorithm: 3 feature points are randomly selected from the matched points, and through computation and repeated iteration the optimal parameter model, the one under which the number of matched feature points is largest, is found. This further controls the mean square error of the overall matching result and effectively improves the accuracy and stability of the algorithm.
In summary, the invention performs image matching by extracting features of an image at different scales, directions and positions, and constructs a hardware-Trojan image registration method based on R-SIFT under the constraint of template matching: coarse matching is performed by template matching, gradients are computed with multi-scale decomposition and weighted statistics of neighborhood information, and matching and screening are finally performed with the similarity strategy and the RANSAC algorithm. This effectively alleviates the loss of matching precision caused by abnormally large values, avoids gradient anomalies caused by outliers such as noise more flexibly, and further improves the stability and precision of the image registration method.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a basic flow diagram of the present invention;
FIG. 2 is a graph comparing experimental results of template matching in the present invention;
FIG. 3 is a graph comparing the results of R-SIFT experiments in the present invention.
Detailed Description
The invention provides a template-matching-based R-SIFT registration method for chip hardware-Trojan images. First, multi-scale, multi-direction template matching finds the coarse matching position of the suspicious-chip image template; then R-SIFT matching is performed between the coarse matching position and the suspicious-chip image.
Specifically, a multi-scale pyramid of the image is constructed and a coarse matching image is obtained by template matching; extreme points are detected in the different scale spaces of a Gaussian pyramid to obtain key feature points, and the information contained in each feature point forms its feature descriptor. The feature descriptors of the two images are matched by distance. Finally, redundant points are screened out by the RANSAC algorithm to obtain the final matching result.
Referring to fig. 1, the R-SIFT chip hardware-Trojan image registration method of the invention includes the following steps:
S1, constructing a multi-scale pyramid
S101, firstly, converting an input image I to be detected into a gray image, and processing the image according to the following formula:
I_g(x, y) = 0.299 · I_R(x, y) + 0.587 · I_G(x, y) + 0.114 · I_B(x, y)
where I_g(x, y) represents the pixel value of the grayed image I_g at point (x, y), and I_R(x, y), I_G(x, y), I_B(x, y) represent the pixel values of the R, G and B channels of the original image I at point (x, y).
S102, performing a multi-scale transformation on the grayed image to be detected in order to realize multi-resolution image registration.
The image to be detected is divided into image blocks, and the top-left pixel of each block is selected to form a new image of lower resolution. Layering the new image in this way repeatedly, L times in total, yields L groups of images with resolution decreasing from high to low.
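As a non-limiting illustration, steps S101 and S102 can be sketched in Python as follows; the function name, the default number of layers and the standard luminance weights are assumptions made for the example rather than details fixed by the text:

    import numpy as np

    def build_multiscale_pyramid(img_rgb, levels=3):
        # S101: grayscale conversion with the standard luminance weights
        # (assumed; see the reconstructed formula above).
        gray = (0.299 * img_rgb[..., 0]
                + 0.587 * img_rgb[..., 1]
                + 0.114 * img_rgb[..., 2])
        pyramid = [gray]
        # S102: each level keeps the top-left pixel of every 2x2 block,
        # halving the resolution L times from high to low.
        for _ in range(levels):
            pyramid.append(pyramid[-1][::2, ::2])
        return pyramid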
S2, performing correlation matching
S201, reading in the template image and performing correlation matching over the whole searched image: the template T (M×N pixels) is translated over the searched image I_g (W×H pixels) and matched point by point; the area of the searched image covered by the template is called the subgraph S_ij, where (i, j) is the coordinate of the upper-left corner of the subgraph on the searched image; the search range is 1 ≤ i ≤ W−M, 1 ≤ j ≤ H−N.
S202, comparing the similarity of the templates T and S_ij. The matching degree of the template T and the subgraph S_ij can be measured as follows:
D(i, j) = Σ_{m=1}^{M} Σ_{n=1}^{N} [ S_ij(m, n) − T(m, n) ]²    (1)
After being expanded, this gives
D(i, j) = Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n)² − 2 Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n) · T(m, n) + Σ_{m=1}^{M} Σ_{n=1}^{N} T(m, n)²
The third term on the right-hand side is the total energy of the template, a constant independent of (i, j); the first term is the energy of the sub-image under the template coverage, which varies slowly with position (i, j); and the second term is the cross-correlation between the sub-image and the template, which changes rapidly with (i, j) and takes its largest value when the template T and the subgraph S_ij match. The matching degree can therefore be reflected by the following two correlation functions:
R(i, j) = [ Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n) · T(m, n) ] / [ Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n)² ]    (2)
R(i, j) = [ Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n) · T(m, n) ] / sqrt( Σ_{m=1}^{M} Σ_{n=1}^{N} S_ij(m, n)² · Σ_{m=1}^{M} Σ_{n=1}^{N} T(m, n)² )    (3)
the formulas (1) to (3) can reflect the relative matching degree of the templates, and because the images in the experiment are complex and the template images have many similar subgraphs in the searched images, the formula (3) is selected as the formula for similarity measurement.
S203, in practice the template image and the target in the searched image may differ in orientation, so the template image is rotated four times, 90 degrees each time; each rotated template is matched against the searched image, and the rotation with the highest matching degree is taken as the result, i.e., that point is considered to match the template image and the localization succeeds. The corresponding position and size of the template on the image to be detected are returned, and a subgraph around that position is cropped to obtain the coarse matching result I'.
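For illustration only, the coarse matching of steps S201 to S203 can be sketched with OpenCV, whose cv2.TM_CCORR_NORMED score is the normalized cross-correlation of formula (3); the function name and the return convention are assumptions of the example:

    import cv2
    import numpy as np

    def coarse_match(search_img, template):
        # S201-S203: normalized cross-correlation (cv2.TM_CCORR_NORMED,
        # i.e., formula (3)) evaluated for the template rotated by
        # 0, 90, 180 and 270 degrees; the strongest response wins.
        # Both arrays are expected as uint8 or float32.
        best_score, best_loc, best_rot = -1.0, None, 0
        for k in range(4):
            t = np.ascontiguousarray(np.rot90(template, k))
            res = cv2.matchTemplate(search_img, t, cv2.TM_CCORR_NORMED)
            _, max_val, _, max_loc = cv2.minMaxLoc(res)
            if max_val > best_score:
                best_score, best_loc, best_rot = max_val, max_loc, k
        # best_loc is the upper-left corner (i, j); crop a subgraph
        # around it to obtain the coarse matching result I'.
        return best_score, best_loc, best_rot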
S3, performing Gaussian filtering and downsampling on the coarse matching result image to construct a multi-scale space;
S301, convolving the pixels of the coarse matching result image I' one by one with the Gaussian kernel function G(x, y, σ) to obtain the representation L(x, y, σ) of each pixel at scale σ:
L(x,y,σ)=G(x,y,σ)*I′(x,y)
wherein the Gaussian kernel function is:
G(x, y, σ) = ( 1 / (2πσ²) ) · exp( −(x² + y²) / (2σ²) )
the convolution operation is carried out on the original number image by convolution kernels with different scales, so that the representation of the image on different scales can be obtained. By means of Gaussian filtering layer by layer, the obtained images with different scales can form a Gaussian pyramid.
S302, subtracting adjacent-scale images of the Gaussian pyramid obtained in the previous step yields the corresponding difference-of-Gaussian image D(x, y, σ), namely:
D(x, y, σ) = ( G(x, y, kσ) − G(x, y, σ) ) * I′(x, y) = L(x, y, kσ) − L(x, y, σ)
where k is the constant scale ratio between adjacent scale spaces; k = 2 in this experiment.
With this formula, response images of the original image at different scales are obtained. Arranging these images of different scales σ layer by layer gives the difference-of-Gaussian pyramid of the original image, i.e., its representation in different scale spaces.
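A minimal sketch of steps S301 and S302 follows; sigma0 and n_scales are illustrative defaults (only k = 2 is fixed by the text):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def dog_pyramid(img, sigma0=1.6, k=2.0, n_scales=4):
        # S301: Gaussian smoothing at scales sigma0 * k**s.
        gauss = [gaussian_filter(img.astype(np.float32), sigma0 * k ** s)
                 for s in range(n_scales)]
        # S302: D(x, y, sigma) = L(x, y, k*sigma) - L(x, y, sigma)
        return [g2 - g1 for g1, g2 in zip(gauss, gauss[1:])]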
S4, detecting extreme points of the image in different scale spaces;
S401, detecting extreme points of the image in the difference-of-Gaussian pyramid space. Taking the current point as center, its 8 neighbors in the same layer, the 9 points formed by the corresponding point and its neighbors in the layer above, and the 9 points formed by the corresponding point and its neighbors in the layer below, 26 points in total, can be used as neighborhood points to test whether the point is an extremum. Because extremum detection in a discrete space is not accurate enough, the extreme point is localized precisely by interpolation.
For the difference image D(x, y, σ) extracted in the previous step, let z = (x, y, σ)^T; the Taylor expansion at the extreme point is then:
D(z) = D + (∂D/∂z)^T · z + (1/2) · z^T · (∂²D/∂z²) · z
where D is the value at the sample point. Differentiating the above expression and setting the derivative to zero gives
ẑ = −(∂²D/∂z²)^(−1) · (∂D/∂z)
i.e., the offset of the true extreme point from the sampled difference point, so the final extreme point is z + ẑ.
The positioning accuracy of the extreme point can be improved through multiple iterations.
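The 26-neighbor test of step S401 can be sketched as follows; the interpolation refinement described above is omitted, and the indices are assumed to point at an interior voxel of the difference-of-Gaussian stack:

    import numpy as np

    def is_extremum(dog, s, y, x):
        # S401: compare D at (x, y, sigma) with its 26 neighbours: 8 in
        # the same DoG layer plus 9 in each adjacent layer. Assumes
        # 1 <= s <= len(dog) - 2 and an interior pixel (y, x).
        v = dog[s][y, x]
        cube = np.stack([layer[y - 1:y + 2, x - 1:x + 2]
                         for layer in (dog[s - 1], dog[s], dog[s + 1])])
        return v == cube.max() or v == cube.min()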
S402, removing some points with weak features, such as low-contrast points, by substituting the offset of each difference point into the Taylor expansion:
D(ẑ) = D + (1/2) · (∂D/∂z)^T · ẑ
Points whose value is below a certain threshold are removed; the threshold is set to 0.03 in this experiment.
Since the difference-of-Gaussian pyramid generates strong edge responses, unstable edge response points are further removed.
S403, in a departure from the standard SIFT algorithm, the gradient values are obtained with the ratio of exponentially weighted averages (ROEWA), which avoids the influence of abnormal noise points.
Let f (σ, x) and f (σ, y) be the first derivatives in the x-and y-directions, respectively, at the scale σ:
f(σ, x) = log( M₁(σ, x) / M₂(σ, x) )
f(σ, y) = log( M₁(σ, y) / M₂(σ, y) )
where M₁(σ, x), M₂(σ, x), M₁(σ, y) and M₂(σ, y) are each computed as exponentially weighted averages; at pixel point I(a, b) they take the form
M₁(σ, x) = Σ_{(i, j) ∈ R, i > 0} e^{−(|i| + |j|)/σ} · I(a + i, b + j)
M₂(σ, x) = Σ_{(i, j) ∈ R, i < 0} e^{−(|i| + |j|)/σ} · I(a + i, b + j)
M₁(σ, y) = Σ_{(i, j) ∈ R, j > 0} e^{−(|i| + |j|)/σ} · I(a + i, b + j)
M₂(σ, y) = Σ_{(i, j) ∈ R, j < 0} e^{−(|i| + |j|)/σ} · I(a + i, b + j)
where R represents the neighborhood range at the time of weighted averaging. The gradient magnitude f (σ, N) and direction f (σ, T) are then:
f(σ, N) = sqrt( f(σ, x)² + f(σ, y)² )
f(σ, T) = arctan( f(σ, y) / f(σ, x) )
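Since the original formula images are not reproduced, the following Python sketch of step S403 assumes the common SAR-SIFT-style form of the ratio of exponentially weighted averages (log-ratio of half-neighborhood means with an exp(−(|i|+|j|)/σ) weighting); the kernel size is likewise an assumption:

    import numpy as np
    from scipy.ndimage import correlate

    def roewa_gradient(img, sigma=2.0):
        # S403 sketch: gradient by ratio of exponentially weighted
        # averages over the two half-neighbourhoods of each pixel.
        r = int(3 * sigma)
        idx = np.arange(-r, r + 1)
        jj, ii = np.meshgrid(idx, idx)        # ii: y offsets, jj: x offsets
        w = np.exp(-(np.abs(ii) + np.abs(jj)) / sigma)
        img = img.astype(np.float64) + 1e-6   # avoid division by zero
        # weighted sums over the half-planes on either side of the pixel;
        # the ratio of sums equals the ratio of means by kernel symmetry
        m1x = correlate(img, w * (jj > 0)); m2x = correlate(img, w * (jj < 0))
        m1y = correlate(img, w * (ii > 0)); m2y = correlate(img, w * (ii < 0))
        fx, fy = np.log(m1x / m2x), np.log(m1y / m2y)
        return np.hypot(fx, fy), np.arctan2(fy, fx)  # magnitude, direction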
S5, assigning an orientation to the feature points obtained in the previous step to obtain their position, scale and orientation information, and generating feature descriptors;
s501, in order to keep the feature vector having rotation invariance, it is necessary to rotate the image gradient position and direction in a certain square neighborhood around the feature point by an angle θ, that is, the x-axis of the original image is rotated to be the same as the main direction:
(x′, y′)^T = [ cos θ  −sin θ ; sin θ  cos θ ] · (x, y)^T
after the rotation, the square neighborhood region is re-selected and equally spaced into 4 × 4 sub-regions.
S502, gradient orientation histograms with 8 directions are computed in each sub-region, i.e., covering 0 to 360 degrees at 45-degree intervals. For one feature point, the descriptor therefore contains 4 × 4 × 8 = 128 values in total.
Gaussian weighting is then applied to the feature descriptor, so that sub-regions close to the central point receive larger weights while regions far from the central point are given smaller weights.
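A simplified sketch of the descriptor construction of steps S501 and S502 follows; it subtracts the main orientation from the gradient directions instead of resampling the rotated neighborhood, and it omits the Gaussian weighting, so it illustrates the 4 × 4 × 8 histogram layout rather than the full procedure:

    import numpy as np

    def describe_keypoint(mag, ang, y, x, theta, width=16):
        # S501: remove the main orientation theta from the gradient
        # directions of a width x width patch around (y, x).
        half = width // 2
        pm = mag[y - half:y + half, x - half:x + half]
        pa = (ang[y - half:y + half, x - half:x + half] - theta) % (2 * np.pi)
        # S502: 4x4 cells, 8 orientation bins of 45 degrees each.
        desc = np.zeros((4, 4, 8))
        cell = width // 4
        bins = (pa / (np.pi / 4)).astype(int) % 8
        for i in range(4):
            for j in range(4):
                m = pm[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
                b = bins[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
                for k in range(8):
                    desc[i, j, k] = m[b == k].sum()
        v = desc.ravel()                       # 4 * 4 * 8 = 128 values
        return v / (np.linalg.norm(v) + 1e-12)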
S6, preliminarily generating a pre-matching point pair by using a distance matching method, and representing the pre-matching point pair by using a matching matrix;
S601, matching the feature vectors of the two images by distance and pre-matching with a nearest-neighbor / next-nearest-neighbor distance-ratio strategy: for each point of image one, the ratio of the Euclidean distance to its nearest neighbor in image two to the Euclidean distance to its next-nearest neighbor is used as the distance d_ratio, and points below a certain threshold are retained; the threshold is set to 1.0 in this experiment.
The pre-matching points are selected according to distance and sorted; that is, for each feature point the retained candidates L_2j^(1), L_2j^(2), ..., L_2j^(n) satisfy d_ratio(L_2j^(1)) ≤ d_ratio(L_2j^(2)) ≤ ... ≤ d_ratio(L_2j^(n)), where n represents the number of candidate matching points remaining after the pre-matching.
S602, sorting the points obtained in the previous step by distance gives the matching matrix C, the M × (n + 1) matrix whose j-th row is (L_1j, L_2j^(1), L_2j^(2), ..., L_2j^(n)), where the first-column elements come from the feature points of the first image and the remaining columns come from the feature points of the second image. M represents the number of matching point pairs obtained after the pre-matching stage and n the number of candidate matching points retained after pre-matching; the pre-matching points L_2j^(k) of the second image appear in each row sorted in order of distance.
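A minimal sketch of the pre-matching of steps S601 and S602 is given below; desc1 and desc2 stand for the descriptor arrays of the two images and n_keep for the number n of retained candidates (the names and the row layout are assumptions of the example):

    import numpy as np

    def prematch(desc1, desc2, ratio_th=1.0, n_keep=3):
        # S601-S602: for each descriptor of image one, rank image-two
        # descriptors by Euclidean distance, apply the nearest /
        # next-nearest ratio test (threshold 1.0 as quoted in the text),
        # and keep the n_keep closest candidates as one row of C.
        rows = []
        for i, d in enumerate(desc1):
            dist = np.linalg.norm(desc2 - d, axis=1)
            order = np.argsort(dist)
            if dist[order[0]] / dist[order[1]] < ratio_th:
                rows.append([i] + order[:n_keep].tolist())
        return np.array(rows)  # column 0: index of L_1j; rest: candidates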
And S7, after the pre-matching matrix C is determined, correcting by using a RANSAC algorithm, and removing redundant matching points.
S701, selecting the sub-matrix of size (p·M) × 2 at the upper-left corner of the matrix C, denoted C_1, where p is a hyperparameter; p = 0.5 in this experiment.
S702, randomly selecting three matching pairs in the matrix C_1 and computing the affine transformation parameters T_θ; then computing, from these affine parameters, the root-mean-square error between the predicted and actual matching points of the remaining pairs, and keeping the matching pairs whose root-mean-square error is smaller than RMSE_th1 (RMSE_th1 is set to 1.0 in this experiment); the number of matching pairs retained this time is recorded.
S703, repeating the previous step; after Iter iterations, selecting the group with the largest number of retained matching pairs in the experiment and computing the affine transformation parameters T̂_θ under that group of matching pairs.
S704, for each feature point L_1j (j = 1, 2, ..., M) in the first image, using the affine transformation parameters T̂_θ to select, among its n candidate matching points in the matching matrix, the candidate whose root-mean-square error is smallest and below RMSE_th1 as the point's final match; finally updating and computing the affine transformation parameters of the final matching result and the overall root-mean-square error RMSE_1.
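For illustration, the RANSAC correction of steps S701 to S704 can be sketched as follows, using a least-squares fit of the 6-parameter affine model; the candidate-reselection of step S704 is condensed here into a final re-estimate over the retained pairs:

    import numpy as np

    def ransac_affine(pts1, pts2, iters=1000, rmse_th=1.0):
        # S701-S704: fit an affine model T_theta to three random
        # correspondences, keep the model retaining the most pairs with
        # error below rmse_th (1.0 as in the text), then re-estimate on
        # all retained pairs. pts1, pts2 are (M, 2) coordinate arrays.
        rng = np.random.default_rng(0)
        A = np.hstack([pts1, np.ones((len(pts1), 1))])  # rows [x, y, 1]
        best = np.zeros(len(pts1), dtype=bool)
        for _ in range(iters):
            idx = rng.choice(len(pts1), 3, replace=False)
            T, *_ = np.linalg.lstsq(A[idx], pts2[idx], rcond=None)
            err = np.sqrt(((A @ T - pts2) ** 2).sum(axis=1))
            inliers = err < rmse_th
            if inliers.sum() > best.sum():
                best = inliers
        T, *_ = np.linalg.lstsq(A[best], pts2[best], rcond=None)
        return T, best  # final affine parameters and retained matches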
Referring to fig. 2 and 3, although the integrated-circuit design layout to be detected and the suspicious-chip template image differ greatly in size, 1578 matching points are found after template matching and the R-SIFT operation, and matching points in the design layout are found for more than 90% of the area of the suspicious-chip template image, which shows that the invention can accurately match corresponding feature positions in different images.
In conclusion, the template-matching-based R-SIFT chip hardware-Trojan image registration method realizes multi-scale, multi-resolution image registration. First, a multi-scale pyramid of the image is constructed and a coarse matching image is obtained by template matching, concentrating the R-SIFT feature matching region near the coarse matching result; extreme points are detected in the different scale spaces of a Gaussian pyramid to obtain key feature points, and computing gradients with the ratio of exponentially weighted averages introduced into R-SIFT further reduces the interference of noise points; the information contained in the feature points is integrated into feature descriptors, and the descriptors of the two images are matched by distance; finally, redundant points are screened out by the RANSAC algorithm to obtain the final matching result. The method can accurately match corresponding targets in the two images.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned contents are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modification made on the basis of the technical idea of the present invention falls within the protection scope of the claims of the present invention.

Claims (8)

1. A registration method of an R-SIFT chip hardware Trojan image based on template matching, characterized by comprising the following steps:
S1, constructing a multi-scale pyramid;
S2, performing correlation matching on the multi-scale pyramid constructed in step S1 to obtain a coarse matching result image;
S3, performing Gaussian filtering and downsampling on the coarse matching result image obtained in step S2 to construct a multi-scale space;
S4, detecting extreme points of the image in the different scale spaces constructed in step S3;
S5, assigning an orientation to each extreme point obtained in step S4 to obtain its position, scale and orientation information and to generate a feature descriptor;
S6, judging whether key points in the two images are similar using the ratio of Euclidean distances to the nearest-neighbor and next-nearest-neighbor feature descriptors, preliminarily generating pre-matching point pairs and expressing them as a matching matrix;
S7, after the pre-matching matrix is determined, correcting it with the RANSAC algorithm and removing redundant matching points to complete image matching.
2. The method according to claim 1, wherein step S1 is specifically:
S101, converting an input image I to be detected into a grayscale image and processing it to obtain the grayed image I_g, whose pixel value at point (x, y) is I_g(x, y);
S102, dividing the image to be detected into image blocks, taking the top-left pixel of each block to form a new image, and layering the new image repeatedly, L times in total, to obtain L groups of images with resolution decreasing from high to low, thereby realizing multi-resolution image registration.
3. The method according to claim 1, wherein step S2 is specifically:
S201, reading in a template image and performing correlation matching over the whole searched image: the template T (M×N pixels) is translated over the searched image I_g (W×H pixels) and matched point by point; the area of the searched image covered by the template is the subgraph S_ij, where (i, j) is the coordinate of the upper-left corner of the subgraph on the searched image; the search range is 1 ≤ i ≤ W−M, 1 ≤ j ≤ H−N;
S202, comparing the similarity of the template T and the subgraph S_ij;
S203, rotating the template image four times, 90 degrees each time, performing the template matching operation between each rotated template and the searched image, and taking the rotation with the highest matching degree as the successful localization result; returning the corresponding position and size of the template on the image to be detected, and cropping a subgraph around that position to obtain the coarse matching result I'.
4. The method according to claim 1, wherein step S3 is specifically:
S301, convolving the pixels of the coarse matching result image I' one by one with a Gaussian kernel function G(x, y, σ) to obtain the representation L(x, y, σ) of the pixels at scale σ; applying Gaussian filtering layer by layer, the resulting images of different scales form a Gaussian pyramid;
S302, subtracting images of adjacent scales of the Gaussian pyramid to obtain the corresponding difference-of-Gaussian image D(x, y, σ).
5. The method according to claim 1, wherein step S4 is specifically:
S401, detecting extreme points of the image in the difference-of-Gaussian pyramid space: taking the current point as center, its 8 neighbors in the same layer, the 9 points formed by the corresponding point and its neighbors in the layer above, and the 9 points formed by the corresponding point and its neighbors in the layer below, 26 points in total, serve as the neighborhood against which the current point is tested for being an extremum; the extreme point is then localized precisely by interpolation;
S402, substituting the values of the remaining extreme points into the Taylor expansion;
S403, differentiating the Taylor expansion to obtain the gradient, computing gradient values with the ratio of exponentially weighted averages, and removing feature points whose gradient value is below a threshold.
6. The method according to claim 1, wherein step S5 is specifically:
S501, with the feature point as center, rotating the image gradient positions and directions in a square neighborhood around the feature point by an angle θ, i.e., rotating the x-axis of the original image to coincide with the main orientation; after rotation, re-selecting the square neighborhood and dividing it at equal intervals into 4 × 4 sub-regions;
S502, computing gradient orientation histograms with 8 directions in each sub-region, covering 0 to 360 degrees at 45-degree intervals; the feature descriptor of one feature point thus contains 4 × 4 × 8 = 128 values.
7. The method according to claim 1, wherein step S6 is specifically:
S601, matching the feature vectors of the two images by distance, using the ratio of the Euclidean distance to the nearest neighbor to the Euclidean distance to the next-nearest neighbor as the distance d_ratio and keeping the points whose d_ratio is smaller than the threshold;
S602, sorting the obtained points by distance to obtain the matching matrix C, the M × (n + 1) matrix whose j-th row is (L_1j, L_2j^(1), L_2j^(2), ..., L_2j^(n)), where the first-column elements are feature points from the first image and the remaining columns are feature points from the second image; M denotes the number of matching point pairs obtained in the pre-matching stage and n the number of candidate matching points retained after pre-matching; L_2j^(k) (j = 1, 2, ..., M; k = 1, 2, ..., n) is a pre-matching point in the second image, and each row is sorted in order of distance.
8. The method according to claim 1, wherein step S7 is specifically:
S701, selecting the sub-matrix C_1 of size (p·M) × 2 at the upper-left corner of the matrix C, where p is a hyperparameter;
S702, randomly selecting three matching pairs in the sub-matrix C_1 and computing the affine transformation parameters T_θ; then computing, from these affine parameters, the root-mean-square error between the predicted and actual matching points of the remaining pairs, keeping the matching pairs whose root-mean-square error is smaller than RMSE_th1, and recording the number of matching pairs retained;
S703, repeating step S702; after Iter iterations, selecting the group with the largest number of retained matching pairs and computing the affine transformation parameters T̂_θ under that group of matching pairs;
S704, for each feature point L_1j (j = 1, 2, ..., M) in the first image, using the affine transformation parameters T̂_θ to select, among its n candidate matching points in the matching matrix, the candidate whose root-mean-square error is smallest and below RMSE_th1 as the final matching point; finally updating and computing the affine transformation parameters of the final matching result and the overall root-mean-square error RMSE_1.

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination