CN113674174A - Line scanning cylinder geometric correction method and device based on significant row matching - Google Patents

Line scanning cylinder geometric correction method and device based on significant row matching

Info

Publication number
CN113674174A
CN113674174A (application CN202110965505.1A)
Authority
CN
China
Prior art keywords
line
image
matching
corrected
reference image
Prior art date
Legal status
Granted
Application number
CN202110965505.1A
Other languages
Chinese (zh)
Other versions
CN113674174B (en)
Inventor
贺永刚
Current Assignee
Ningbo Prism Space Intelligent Technology Co ltd
Original Assignee
Ningbo Prism Space Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Prism Space Intelligent Technology Co ltd filed Critical Ningbo Prism Space Intelligent Technology Co ltd
Priority to CN202110965505.1A priority Critical patent/CN113674174B/en
Publication of CN113674174A publication Critical patent/CN113674174A/en
Application granted granted Critical
Publication of CN113674174B publication Critical patent/CN113674174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a line scanning cylinder geometric correction method and a device based on salient line matching, wherein the method comprises the following steps: step S1, using a line scan camera to perform line scan on a cylindrical workpiece to be corrected, acquiring an expanded image as an image to be corrected, and acquiring a reference image; step S2, carrying out gradient projection on the reference image and the image to be corrected; step S3, significant line selection is carried out according to the gradient projection value; step S4, for each significant line, searching a candidate matching line corresponding to the significant line on the image to be corrected; step S5, sampling the matching line pairs to obtain q matching line pairs, and determining the optimal sampling matching line pair through multiple iterations; step S6, with the optimal sampling matching line pair as reference, allocating a matching line on the image to be corrected for each salient line on the reference image; and step S7, transforming the image to be corrected by using the matching lines to obtain an image aligned with the reference image, thereby realizing geometric correction.

Description

Line scanning cylinder geometric correction method and device based on significant row matching
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method and a device for geometrically correcting a line-scanning cylinder based on significant row matching.
Background
In the industrial field, in order to detect surface defects of some cylindrical workpieces, it is necessary to image the surface thereof. Common applications include surface defect detection of lipsticks, cylindrical pencils, and the like.
The main method currently used for imaging cylinders is as follows: the workpiece is placed on a mechanism that can rotate it, a line-scan camera is fixed above the cylindrical workpiece, the workpiece is rotated at constant speed by the mechanism, and signals are sent to the line-scan camera to acquire the image. However, cylinder images acquired in this way present two problems:
a. The starting position of the unfolded cylinder image differs each time.
The reason is as follows: the part of the workpiece facing the line-scan camera when it is placed determines the starting position of the unfolded image, and in practice it is difficult to place the cylindrical workpiece in exactly the same orientation every time.
b. The cylindrical image is distorted in the direction of the unfolding.
The reason is as follows: since the rotation angular velocity of the mechanism is not absolutely uniform, the image obtained by the line scan camera is distorted. In this way the cylindrical unfolded image will be elongated or shortened to different degrees and the stretching is non-linear.
Because of these two problems, cylindrical workpiece images are difficult to align with conventional image matching techniques in the way planar workpiece images are, and large-scale inspection using machine vision is therefore not possible.
To solve the above problems, one existing approach reduces image distortion by using ultra-high-precision mechanisms to make the angular velocity more uniform, but its excessive cost makes it difficult to adopt widely.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a line-scan cylinder geometric correction method and device based on significant line matching, which align the cylinder image obtained by line scanning to a reference image on the basis of significant line matching, so that cylindrical materials of the same batch can be aligned reasonably and subjected to large-scale visual defect detection in the same way as planar materials.
To achieve the above and other objects, the present invention provides a line-scan cylinder geometric correction method based on saliency row matching, comprising the following steps:
step S1, acquiring an unfolded image of a perfect and undamaged cylindrical workpiece by using a line-scan camera, preprocessing the image to obtain an ideal image as a reference image, and performing line-scan acquisition on the unfolded image of the cylindrical workpiece to be corrected by using the line-scan camera as an image to be corrected;
step S2, respectively carrying out gradient projection on the reference image and the image to be corrected;
step S3, significant line selection is carried out on the reference image and the image to be corrected according to the gradient projection value;
step S4, for each significant line of the reference image, finding a candidate matching line corresponding to the significant line on the image to be corrected;
step S5, sampling the matching line pairs obtained in step S4 to obtain q matching line pairs, correcting the vertical coordinate of each significant line on the image to be corrected with the significant lines of the reference image in each matching line pair as the reference, searching whether a significant line exists in the corresponding range on the reference image according to the correction result, updating the confidence of the current group of sampled matching line pairs according to the search result, and determining the optimal sampled matching line pairs through multiple iterations according to the confidence;
step S6, with the optimal sampling matching line pair as reference, allocating a matching line on the image to be corrected for each salient line on the reference image;
and step S7, transforming the image to be corrected by using the obtained matching lines to obtain an image aligned with the reference image, thereby realizing geometric correction.
Preferably, in step S1, if the side of the cylinder changes along the Y direction of the image, the width and height of the reference image are the same and different from those of the image to be corrected; if the side surface of the cylinder changes along the X direction of the image, the height of the reference image is the same as that of the image to be corrected, and the width of the reference image is different from that of the image to be corrected.
Preferably, in step S1, the cylinder is expanded in the Y direction by default, and if the cylinder is expanded in the X direction, the expanded image is rotated by 90 degrees.
Preferably, in step S2, the reference image and the image to be corrected are respectively subjected to gradient calculation along the Y direction of the image, and horizontal projection is performed to obtain a column vector with a length equal to the image height.
Preferably, in step S3, the line with the gradient projection value greater than the predetermined threshold is used as the saliency line.
Preferably, in step S4, the match response M(i, k) between the i-th salient line S_R(i) of the reference image and the k-th salient line S_W(k) of the image to be corrected is calculated from R(S_R(i), j), the gray value of the j-th point on the i-th salient line S_R(i) of the reference image R, and W(S_W(k), j), the gray value of the j-th point on the k-th salient line S_W(k) of the image to be corrected W (the formula itself appears only as an image in the original publication); for each salient line of the reference image, the p salient lines of the image to be corrected with the smallest match response are selected as its candidate matching lines according to the match response values.
Preferably, the step S5 further includes the steps of:
step S500, randomly extracting q different salient lines from the salient lines in the reference image, and sorting the salient lines from small to large according to the vertical coordinate;
step S501, for each extracted significant line, extracting matching lines randomly from p candidate matching lines according to probability;
step S502, for q significant rows in the reference image, if the ordinate of the ith significant row is smaller than the ordinate of the jth significant row (i, j ∈ {1,2, …, q } and i ≠ j), then the matching row extracted from the image to be corrected must also satisfy the condition, otherwise, the current sampling is discarded and the step S500 is executed again;
step S503, according to q sampled matching line pairs, taking the sampling line of the reference image as a reference, correcting the ordinate of each significant line on the image to be corrected, searching whether significant lines exist in the corresponding range on the reference image according to the correction result, and updating the confidence coefficient of the sampling matching pair in the current group according to the search result;
and step S504, returning to the step S500, iterating to the set iteration times, and saving the matching line pair with the maximum confidence coefficient as the optimal sampling matching line pair.
Preferably, in step S501, the following operations are performed:
step 1: randomly generating 1 integer between 1 and p, wherein the integer corresponds to the selected candidate matching row;
step2, randomly generating 1 value with the value range of [0,1], and when the value is less than the probability of the candidate matching line, the current candidate matching line is a sampled matching line; otherwise, step1 is re-executed until a matching line is selected that meets the condition.
Preferably, step S503 further includes:
after each group of sampling matching pairs is obtained, the confidence coefficient of the matching pair is initially 0;
correcting the ordinate of each salient line on the image to be corrected with the sampling rows of the reference image as the reference;
and after the corrected vertical coordinate is obtained, searching whether a remarkable line exists in a corresponding range on the reference image, and if so, increasing the current confidence by 1.
Preferably, in step S6, the position of each significant line of the reference image is transformed to obtain the corresponding position of the image to be corrected, and the best corresponding line is found by performing the matching again near the corresponding position.
In order to achieve the above object, the present invention further provides a line-scan cylinder geometry correction apparatus based on saliency row matching, including:
the image acquisition and processing unit is used for acquiring an unfolded image of a perfect and undamaged cylindrical workpiece by using the line-scan camera, preprocessing the image to obtain an ideal image as a reference image, and performing line-scan acquisition and unfolding on the cylindrical workpiece to be corrected by using the line-scan camera to obtain the unfolded image as an image to be corrected;
the gradient projection unit is used for respectively carrying out gradient projection on the reference image and the image to be corrected;
the salient line selection unit is used for carrying out salient line selection on the reference image and the image to be corrected according to the gradient projection value;
the rough matching unit is used for searching a candidate matching line corresponding to each salient line of the reference image on the image to be corrected;
the sampling correction optimizing unit is used for sampling the matching line pairs obtained by the coarse matching unit to obtain q matching line pairs, correcting the vertical coordinate of each significant line on the image to be corrected with the significant lines of the reference image in each matching line pair as the reference, searching whether a significant line exists in the corresponding range on the reference image according to the correction result, updating the confidence of the current group of sampled matching line pairs according to the search result, and determining the optimal sampling matching line pair through multiple iterations according to the confidence;
the fine matching unit is used for taking the optimal sampling matching line pair as a reference and distributing a matching line on the image to be corrected for each salient line on the reference image;
and the geometric correction unit is used for transforming the image to be corrected by using the obtained matching lines to obtain an image aligned with the reference image, so that geometric correction is realized.
Compared with the prior art, the line-scan cylinder geometric correction method and device based on salient line matching disclosed by the invention first acquire a reference image and an image to be corrected with a line-scan camera. Gradient projection is then performed on the reference image and on the image to be corrected, and salient lines are selected on both images according to the gradient projection values. For each salient line of the reference image, candidate matching lines are found on the image to be corrected; the resulting matching line pairs are sampled to obtain q matching line pairs, the ordinate of each salient line on the image to be corrected is corrected with the salient lines of the reference image in each matching line pair as the reference, the corresponding range on the reference image is searched for a salient line according to the correction result, and the confidence of the current group of sampled matching line pairs is updated according to the search result; the optimal group of sampled matching line pairs is determined through multiple iterations according to the confidence. With the optimal sampled matching line pairs as the reference, a matching line on the image to be corrected is allocated to each salient line of the reference image, and finally the image to be corrected is transformed using the obtained matching lines to obtain an image aligned with the reference image. In this way the cylinder image obtained by line scanning is aligned to the reference image on the basis of salient line matching, so that cylindrical materials of the same batch can be aligned reasonably and subjected to large-scale visual defect detection in the same way as planar materials.
Drawings
FIG. 1 is a flow chart illustrating the steps of a method for correcting the geometry of a line-scan cylinder based on saliency row matching according to the present invention;
FIG. 2 is a flow chart of line-scan cylinder geometry correction based on saliency row matching in an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a line-scan cylinder geometry correction apparatus based on saliency row matching according to the present invention.
Detailed Description
Other advantages and capabilities of the present invention will be readily apparent to those skilled in the art from the present disclosure by describing the embodiments of the present invention with specific embodiments thereof in conjunction with the accompanying drawings. The invention is capable of other and different embodiments and its several details are capable of modification in various other respects, all without departing from the spirit and scope of the present invention.
Fig. 1 is a flowchart illustrating the steps of a line-scan cylinder geometric correction method based on saliency row matching according to an embodiment of the present invention, and Fig. 2 is a flowchart of line-scan cylinder geometric correction based on saliency row matching in an embodiment of the present invention. As shown in Fig. 1 and Fig. 2, the line-scan cylinder geometric correction method based on saliency row matching of the present invention includes the following steps:
and step S1, acquiring an unfolded image of a perfect cylinder workpiece by using the line-scan camera, preprocessing the image to obtain an ideal image as a reference image, and acquiring the unfolded image as an image to be corrected by using the line-scan camera to perform line scanning on the cylinder workpiece to be corrected.
In the invention, a line scan camera is used for performing line scan on a cylindrical workpiece to acquire an unfolded image, and the acquired image is an image to be corrected and marked as W.
Similarly, a line-scan camera is used to acquire an unfolded image of a perfect, undamaged cylindrical workpiece, and the final desired ideal image is obtained through operations such as manual translation and cropping; this image serves as the reference image and is denoted R. The reference image contains exactly the unfolded content of the cylindrical workpiece, while the image to be corrected contains repeated content of the cylindrical side surface. The two differ only by stretching along the unfolding direction; the imaged content itself does not change. The aim of the invention is to correct the image to be corrected with the reference image as the reference.
In the embodiment of the invention, if the side surface of the cylinder changes along the Y direction of the image, the width and the height of the reference image are the same and different from those of the image to be corrected; if the side surface of the cylinder changes along the X direction of the image, the height of the reference image is the same as that of the image to be corrected, and the width of the reference image is different from that of the image to be corrected.
For convenience of description, it is assumed in the present invention that the unfolded cylindrical side surface varies along the Y direction of the image. The heights of the reference image and the image to be corrected are denoted h_R and h_W respectively, while their widths are the same and are both denoted w. The operation is similar when the unfolded cylinder side varies along the X direction of the image.
Step S2, performing gradient projection on the reference image and the image to be corrected respectively.
Specifically, the reference image and the image to be corrected are respectively subjected to gradient calculation along the Y direction of the image, and horizontal projection is performed to obtain a column vector with the length being the image height.
Taking the reference image R as an example, the gradient projection P_R(i) of the i-th row is obtained by accumulating the magnitudes of the Y-direction gradient of row i over all w columns (the formula itself appears only as an image in the original publication).
the gradient projection calculation method for the image to be corrected is the same, and is not described herein again.
It should be noted that, in the present invention, the cylinder is developed in the Y direction by default, and if the cylinder is developed in the X direction, the developed image needs to be rotated by 90 degrees, so as to ensure that there is no difference between the reference image and the image content to be corrected in the X direction, and there is a difference in expansion and contraction only in the Y direction.
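As an illustration of the orientation normalization and the row-wise gradient projection of step S2, the following is a minimal sketch. It assumes grayscale images loaded as NumPy arrays and uses a simple forward difference along Y as the gradient operator; the function names and the exact gradient operator are illustrative assumptions, not taken from the patent, whose formula is published only as an image.

```python
import numpy as np

def normalize_orientation(img: np.ndarray, unfold_axis: str = "y") -> np.ndarray:
    """Rotate the unfolded image by 90 degrees if the cylinder was unfolded
    along X, so that stretching only occurs along the Y (row) direction."""
    return img if unfold_axis == "y" else np.rot90(img)

def gradient_projection(img: np.ndarray) -> np.ndarray:
    """Row-wise gradient projection: accumulate the magnitude of the Y-direction
    gradient of each row over all columns, yielding a column vector whose
    length equals the image height."""
    grad_y = np.abs(np.diff(img.astype(np.float64), axis=0))  # forward difference along Y
    proj = grad_y.sum(axis=1)                                 # horizontal projection
    return np.concatenate([proj, [0.0]])                      # pad to image height
```

For example, `gradient_projection(reference)` and `gradient_projection(to_correct)` would produce the projection vectors used for salient-row selection in step S3.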
And step S3, performing salient line selection on the reference image and the image to be corrected according to the gradient projection value.
Matching every row against every row is not only computationally expensive, but also produces numerous mismatches because a large number of rows are not salient. The present invention therefore screens out the relatively salient rows. In the embodiment of the present invention, a threshold h_t is set manually, and every row whose gradient projection value is larger than h_t is taken as a salient row.
Performing salient-row selection on the reference image and on the image to be corrected yields two sets of salient row indices, denoted S_R and S_W, with m and n elements respectively. The ordinate of the i-th salient line of the reference image is denoted y_R(S_R(i)), and the ordinate of the j-th salient line of the image to be corrected is denoted y_W(S_W(j)). The salient rows are stored sorted by ordinate in ascending order.
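A minimal sketch of the threshold-based salient-row selection of step S3, continuing the assumptions of the previous snippet; the threshold value is an illustrative user-chosen number, not one given in the patent.

```python
import numpy as np

def select_salient_rows(projection: np.ndarray, h_t: float) -> np.ndarray:
    """Return the ordinates (row indices) whose gradient projection exceeds
    the threshold h_t, sorted in ascending order."""
    return np.sort(np.flatnonzero(projection > h_t))

# Example (h_t chosen by the user):
# S_R = select_salient_rows(gradient_projection(reference), h_t=500.0)
# S_W = select_salient_rows(gradient_projection(to_correct), h_t=500.0)
```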
Step S4, for each significant line of the reference image, finding a candidate matching line corresponding to the significant line on the image to be corrected.
Rough matching compares each salient line of the reference image with every salient line of the image to be corrected and, for each reference salient line, finds the p salient lines of the image to be corrected with the smallest match response as candidate matching lines. Specifically, the match response M(i, k) between the i-th salient line S_R(i) of the reference image and the k-th salient line S_W(k) of the image to be corrected is computed (the formula appears only as an image in the original publication) from R(S_R(i), j), the gray value of the j-th point on the i-th salient line S_R(i) of the reference image R, and W(S_W(k), j), the gray value of the j-th point on the k-th salient line S_W(k) of the image to be corrected W.
For each salient line of the reference image, the p salient lines with the smallest match response are selected as candidate matching lines; the parameter p is set by the user and is usually 2-4. The p match responses are normalized (each divided by their sum) so that they add up to 1, and each normalized value is subtracted from 1 to obtain the matching probability of the corresponding candidate matching line; subtracting from 1 ensures that a smaller match response gives a higher probability. For example, suppose that for the i-th salient line of the reference image the p = 2 rows with the smallest match responses on the image to be corrected are a and b, with response values M(i, a) = 100 and M(i, b) = 300. After normalization the two values are 100/(100+300) = 0.25 and 300/(100+300) = 0.75, so the matching probabilities are 1 - 0.25 = 0.75 for row a and 1 - 0.75 = 0.25 for row b; that is, row a of the image to be corrected matches row i of the reference image with the higher probability of 0.75.
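The sketch below illustrates one way to implement the rough matching of step S4. The patent gives the match-response formula only as an image, so the sum of absolute gray-level differences used here is an assumption; the normalization to matching probabilities follows the worked example above (p = 2, responses 100 and 300 giving probabilities 0.75 and 0.25).

```python
import numpy as np

def match_response(ref: np.ndarray, warp: np.ndarray, i: int, k: int) -> float:
    """Match response M(i, k) between row i of the reference image and row k of
    the image to be corrected. A sum of absolute gray-level differences is
    assumed here; the patent's exact formula is published only as an image."""
    return float(np.abs(ref[i].astype(np.float64) - warp[k].astype(np.float64)).sum())

def candidate_matches(ref, warp, s_r, s_w, p=2):
    """For each salient row of the reference image, return the p salient rows of
    the image to be corrected with the smallest match response, together with
    their matching probabilities (1 minus the normalized response)."""
    candidates = {}
    for i in s_r:
        responses = np.array([match_response(ref, warp, i, k) for k in s_w])
        best = np.argsort(responses)[:p]                      # p smallest responses
        norm = responses[best] / (responses[best].sum() + 1e-12)
        probs = 1.0 - norm                                    # smaller response -> higher probability
        candidates[int(i)] = list(zip(s_w[best], probs))
    return candidates
```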
And step S5, sampling the matching line pairs obtained in step S4 to obtain q matching line pairs, and correcting the ordinates of the corresponding salient lines on the image to be corrected with the salient lines of the reference image in each matching line pair as the reference.
After the rough matching of step S4, each salient row of the reference image forms matching row pairs with its candidate matching rows in the image to be corrected. The pairs obtained by rough matching contain mismatches of varying degrees, but also a large number of correct pairs, so the reliability of each matching row pair must be assessed further. In the embodiment of the invention, the matching row pairs are evaluated by random sampling, and sampling is repeated many times until a suitable group of matching row pairs is obtained. Step S5 specifically includes the following steps:
step S500, randomly extracting q different salient lines from the salient lines in the reference image as sampling lines, and sorting the sampling lines from small to large according to the ordinate. The parameter q is set by the user, and its value is related to the actually extracted saliency line and the imaging quality. When the number of the remarkable rows is large and the imaging quality is good, the number of the remarkable rows can be set to be larger, and the number of the remarkable rows needs to be smaller, and the value range is usually 5-30.
Step S501, for each extracted sampling line, randomly extracting matching lines from p candidate matching lines according to probability, wherein the specific operation is as follows:
Step 1: randomly generate an integer in the range 1, 2, …, p; this integer selects a candidate matching row.
Step 2: randomly generate a value in the range [0, 1]; if the value is less than the matching probability of the selected candidate matching row, that candidate becomes the sampled matching row; otherwise, Step 1 is executed again until a matching row satisfying the condition is selected.
Step S502, verifying the sequence consistency. For q significant lines (sampling lines) in the reference image, if the ordinate of the ith significant line is smaller than the ordinate of the jth significant line (i, j ∈ {1,2, …, q } and i ≠ j), the matching line extracted on the image to be corrected must also satisfy the condition, otherwise, the sampling is discarded and the step S500 is executed again.
After the sampling matching pair operation is performed, q matching row pairs are obtained.
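A minimal sketch of the sampling procedure of steps S500-S502, assuming the candidate dictionary produced by the rough-matching sketch above. The rejection-sampling loop in `draw_match` mirrors Step 1 and Step 2, and `sample_pairs` re-draws whenever the order-consistency check fails; both function names are illustrative.

```python
import random

def draw_match(cands):
    """Steps 1-2: pick one of the p candidates uniformly at random, accept it
    with its matching probability, and retry until a candidate is accepted."""
    while True:
        row, prob = random.choice(cands)      # Step 1: random candidate
        if random.random() < prob:            # Step 2: accept with its probability
            return int(row)

def sample_pairs(candidates, q):
    """Steps S500-S502: draw q distinct reference salient rows and one matching
    row each, discarding the draw if the matched rows are not in the same order."""
    while True:
        ref_rows = sorted(random.sample(list(candidates), q))        # S500
        warp_rows = [draw_match(candidates[r]) for r in ref_rows]     # S501
        if all(a < b for a, b in zip(warp_rows, warp_rows[1:])):      # S502 order check
            return list(zip(ref_rows, warp_rows))
```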
Step S503, according to the q sampled matching line pairs, taking the sampling line of the reference image as a reference, correcting the ordinate of each significant line on the image to be corrected, searching whether significant lines exist in the corresponding range on the reference image according to the correction result, and updating the confidence of the current group of sampling matching pairs according to the search result.
Specifically, after each group of sampled matching pairs is drawn (q matching row pairs form one group), the confidence of the group is initialized to 0. Denote the ordinates of the q sampled salient lines on the reference image, sorted in ascending order, by y_R^s(1), y_R^s(2), …, y_R^s(q), and the ordinates of their matching lines on the image to be corrected W by y_W^s(1), y_W^s(2), …, y_W^s(q).
The ordinate of every salient line on the image to be corrected is then corrected with the sampled lines (salient lines) of the reference image as the reference (that is, all salient lines are used to verify whether the currently selected group of matching line pairs is reasonable). Taking the j-th salient line S_W(j) as an example, its original ordinate is y_W(S_W(j)) and its corrected ordinate ŷ_W(S_W(j)) is obtained piecewise from the neighbouring sampled pairs (the original formulas are given as images and follow the same adjacent-pair proportional scaling as the explicit formulas of step S7):
a. when y_W(S_W(j)) < y_W^s(1), the sampled pairs 1 and 2 are used;
b. when y_W^s(k) ≤ y_W(S_W(j)) < y_W^s(k+1), with k = 1, 2, …, q-1, the sampled pairs k and k+1 are used;
c. when y_W(S_W(j)) ≥ y_W^s(q), the sampled pairs q-1 and q are used.
After the corrected ordinate ŷ_W(S_W(j)) is obtained, the corresponding range [ŷ_W(S_W(j)) - r_t, ŷ_W(S_W(j)) + r_t] on the reference image is searched; if a salient line of the reference image is found inside it, the current confidence is increased by 1. Here r_t is a user-set parameter, typically 2-5: the smaller r_t, the stricter the decision; the larger r_t, the more relaxed the decision.
Step S504 returns to step S500 to perform iteration to the set iteration number (for example, 100), and a group of matching line pairs with the highest confidence is saved as the optimal sampling matching line pair, that is, a best group of matching groups is selected from all the matching groups, and each group is q matching line pairs.
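The sketch below illustrates the confidence scoring of step S503 and the iteration of step S504. The piecewise mapping uses the adjacent-pair proportional scaling described above (the patent's own formulas are published only as images), and the symmetric ±r_t search window is an assumption consistent with the description of r_t; the sampler is passed in as a callable, for example the `sample_pairs` function sketched after step S502.

```python
import numpy as np

def piecewise_map(y, src, dst):
    """Map ordinate y from the source image to the destination image using
    sorted matched ordinates src[k] <-> dst[k], scaling by the spacing of the
    two nearest matched rows (extrapolating with the first or last pair)."""
    k = int(np.clip(np.searchsorted(src, y, side="right") - 1, 0, len(src) - 2))
    return dst[k] + (y - src[k]) * (dst[k + 1] - dst[k]) / (src[k + 1] - src[k])

def confidence(pairs, s_w_rows, s_r_rows, r_t=3):
    """Step S503: map every salient row of the image to be corrected into
    reference coordinates and count how many land within r_t of a reference
    salient row."""
    ref_s = np.array([p[0] for p in pairs], dtype=float)
    warp_s = np.array([p[1] for p in pairs], dtype=float)
    score = 0
    for y in s_w_rows:
        y_hat = piecewise_map(float(y), warp_s, ref_s)
        if np.min(np.abs(np.asarray(s_r_rows, dtype=float) - y_hat)) <= r_t:
            score += 1
    return score

def best_sample(sampler, s_w_rows, s_r_rows, iters=100):
    """Step S504: repeat sampling and keep the group with the highest confidence."""
    best, best_score = None, -1
    for _ in range(iters):
        pairs = sampler()
        score = confidence(pairs, s_w_rows, s_r_rows)
        if score > best_score:
            best, best_score = pairs, score
    return best
```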
In step S6, a matching line is allocated to each significant line on the reference image on the image to be corrected, with the optimal sampling matching line pair as a reference.
Suppose the optimal group of sampled matching line pairs obtained by the iterative optimization has ordinates on the reference image, sorted in ascending order, denoted y_R^o(1), y_R^o(2), …, y_R^o(q), and that the corresponding best matching lines on the image to be corrected W have ordinates y_W^o(1), y_W^o(2), …, y_W^o(q).
The position of each salient line of the reference image is transformed to a corresponding position on the image to be corrected, and matching is performed again near that position to find the best corresponding line. Taking the i-th salient line S_R(i) of the reference image as an example, its original ordinate is y_R(S_R(i)); its transformed ordinate on the image to be corrected is denoted y_R^W(S_R(i)) (note that the original reference image is not modified: a coordinate on the reference image is transformed to a coordinate on the image to be corrected, hence the subscript R and the superscript W). The transformed ordinate is computed piecewise from the neighbouring optimal pairs (the original formulas are given as images and follow the same adjacent-pair proportional scaling as the explicit formulas of step S7):
a. when y_R(S_R(i)) < y_R^o(1), the optimal pairs 1 and 2 are used;
b. when y_R^o(k) ≤ y_R(S_R(i)) < y_R^o(k+1), with k = 1, 2, …, q-1, the optimal pairs k and k+1 are used;
c. when y_R(S_R(i)) ≥ y_R^o(q), the optimal pairs q-1 and q are used.
After the corresponding position y_R^W(S_R(i)) on the image to be corrected is obtained for the salient line S_R(i), matching is performed again within a small range of rows around y_R^W(S_R(i)) (the range is given as an image in the original publication): for each integer row index k in that range, the match response between S_R(i) and row k of the image to be corrected is computed as in step S4, and the row with the minimum response value is taken as the line of the image to be corrected corresponding to the reference salient line S_R(i). For convenience of description it is denoted S'_W(i), with ordinate y_W(S'_W(i)).
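A sketch of the fine-matching step S6 under the same assumptions: reference salient rows are mapped into the coordinates of the image to be corrected with the same adjacent-pair scaling, and the best row is re-selected inside a small search window. The window half-width `r` and the sum-of-absolute-differences response are illustrative assumptions; the patent gives the search range and response formula only as images.

```python
import numpy as np

def refine_matches(ref, warp, s_r_rows, best_pairs, r=3):
    """Step S6: for every salient row of the reference image, transform its
    ordinate into the image to be corrected using the optimal sampled pairs,
    then re-match within +/- r rows and keep the row with the smallest
    (assumed) sum-of-absolute-differences response."""
    ref_s = np.array([p[0] for p in best_pairs], dtype=float)
    warp_s = np.array([p[1] for p in best_pairs], dtype=float)
    matches = {}
    for i in s_r_rows:
        # piecewise proportional mapping from reference to to-be-corrected coordinates
        k = int(np.clip(np.searchsorted(ref_s, i, side="right") - 1, 0, len(ref_s) - 2))
        y = warp_s[k] + (i - ref_s[k]) * (warp_s[k + 1] - warp_s[k]) / (ref_s[k + 1] - ref_s[k])
        lo = max(0, int(round(y)) - r)
        hi = min(warp.shape[0] - 1, int(round(y)) + r)
        rows = np.arange(lo, hi + 1)
        responses = [np.abs(ref[i].astype(float) - warp[kk].astype(float)).sum() for kk in rows]
        matches[int(i)] = int(rows[int(np.argmin(responses))])
    return matches
```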
And step S7, transforming the image to be corrected by using the obtained matching lines to obtain an image aligned with the reference image, thereby realizing geometric correction.
In the present invention, the row position u' on the image to be corrected W that corresponds to the u-th row of the aligned image C is obtained as follows:
a. When u < y_R(S_R(1)), the proportional relationship between the ordinates of the 1st and 2nd salient lines is used to estimate the corresponding row u' on the image to be corrected W:
(u' - y_W(S'_W(1))) / (u - y_R(S_R(1))) = (y_W(S'_W(1)) - y_W(S'_W(2))) / (y_R(S_R(1)) - y_R(S_R(2)))
Here u' - y_W(S'_W(1)) is the distance from row u' to row y_W(S'_W(1)) on the image W; u - y_R(S_R(1)) is the distance from row u to row y_R(S_R(1)) on the image R; y_W(S'_W(1)) - y_W(S'_W(2)) is the distance from row y_W(S'_W(1)) to row y_W(S'_W(2)) on the image W; and y_R(S_R(1)) - y_R(S_R(2)) is the distance from row y_R(S_R(1)) to row y_R(S_R(2)) on the image R. The invention obtains the above equation by scaling the distances between adjacent matched rows.
b. When y_R(S_R(i)) ≤ u < y_R(S_R(i+1)), with i = 1, 2, …, q-1, the proportional relationship between the ordinates of the i-th and (i+1)-th salient lines is used to estimate the corresponding row u' on the image to be corrected W:
(u' - y_W(S'_W(i))) / (u - y_R(S_R(i))) = (y_W(S'_W(i)) - y_W(S'_W(i+1))) / (y_R(S_R(i)) - y_R(S_R(i+1)))
with the terms defined analogously to case a.
c. When u ≥ y_R(S_R(q)), the proportional relationship between the ordinates of the (q-1)-th and q-th salient lines is used to estimate the corresponding row u':
(u' - y_W(S'_W(q))) / (u - y_R(S_R(q))) = (y_W(S'_W(q)) - y_W(S'_W(q-1))) / (y_R(S_R(q)) - y_R(S_R(q-1)))
again with the terms defined analogously to case a.
Once this correspondence is obtained, u' is generally not an integer, so the gray value of every point of the u-th row of the aligned image C is obtained by linear interpolation between the two rows of W adjacent to u':
C(u, v) = (⌈u'⌉ - u') · W(⌊u'⌋, v) + (u' - ⌊u'⌋) · W(⌈u'⌉, v)
where ⌊u'⌋ is u' rounded down to the nearest row index, ⌈u'⌉ = ⌊u'⌋ + 1, and v = 1, 2, …, w is the abscissa, w being the common width of the reference image and the image to be corrected.
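Finally, a sketch of the geometric correction of step S7: each row u of the aligned image is mapped to a (generally fractional) row u' of the image to be corrected by the proportional relations above, and the gray values are obtained by linear interpolation between the two neighbouring rows. The function name is illustrative, and `matches` is assumed to be a dictionary mapping reference salient-row ordinates to the corresponding rows of the image to be corrected, as produced by the fine-matching sketch.

```python
import numpy as np

def geometric_correction(warp, matches, out_height):
    """Step S7: build the aligned image C row by row from the matched row pairs."""
    ref_rows = np.array(sorted(matches), dtype=float)
    warp_rows = np.array([matches[r] for r in sorted(matches)], dtype=float)
    h_w, w = warp.shape
    aligned = np.zeros((out_height, w), dtype=np.float64)
    for u in range(out_height):
        # piecewise proportional mapping of row u to a fractional row u' of W
        k = int(np.clip(np.searchsorted(ref_rows, u, side="right") - 1, 0, len(ref_rows) - 2))
        u_prime = warp_rows[k] + (u - ref_rows[k]) * (warp_rows[k + 1] - warp_rows[k]) / (ref_rows[k + 1] - ref_rows[k])
        u_prime = float(np.clip(u_prime, 0, h_w - 1))
        lo = int(np.floor(u_prime))
        hi = min(lo + 1, h_w - 1)
        frac = u_prime - lo
        aligned[u] = (1.0 - frac) * warp[lo] + frac * warp[hi]   # linear interpolation
    return aligned
```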
Fig. 3 is a schematic structural diagram of a line-scan cylinder geometry correction apparatus based on saliency row matching according to the present invention. As shown in fig. 3, the present invention provides a line-scan cylinder geometry correction apparatus based on saliency row matching, including:
the image acquisition and processing unit 201 is configured to acquire an unfolded image of an intact cylindrical workpiece by using the line-scan camera, perform preprocessing to obtain an ideal image as a reference image, and perform line-scan acquisition on the unfolded image of the cylindrical workpiece to be corrected by using the line-scan camera as an image to be corrected.
In the present invention, the image acquisition processing unit 201 performs line scanning on the cylindrical workpiece by using the line scan camera to acquire an expanded image, and the acquired image is an image to be corrected and is denoted as W.
Similarly, the image acquisition processing unit 201 acquires an unfolded image of a perfect, undamaged cylindrical workpiece with the line-scan camera and obtains the final desired ideal image through operations such as manual translation and cropping; this image serves as the reference image and is denoted R. The reference image contains exactly the unfolded content of the cylindrical workpiece, while the image to be corrected contains repeated content of the cylindrical side surface. The two differ only by stretching along the unfolding direction; the imaged content itself does not change. The aim of the invention is to correct the image to be corrected with the reference image as the reference.
In the embodiment of the invention, if the side surface of the cylinder changes along the Y direction of the image, the width and the height of the reference image are the same and different from those of the image to be corrected; if the side surface of the cylinder changes along the X direction of the image, the height of the reference image is the same as that of the image to be corrected, and the width of the reference image is different from that of the image to be corrected.
For convenience of description, it is assumed in the present invention that the unfolded cylindrical side surface varies along the Y direction of the image. The heights of the reference image and the image to be corrected are denoted h_R and h_W respectively, while their widths are the same and are both denoted w. The operation is similar when the unfolded cylinder side varies along the X direction of the image.
A gradient projection unit 202, configured to perform gradient projection on the reference image and the image to be corrected respectively.
Specifically, the gradient projection unit 202 calculates the gradient of the reference image and of the image to be corrected along the Y direction of the image and performs horizontal projection, obtaining a column vector whose length equals the image height.
Taking the reference image R as an example, the gradient projection P_R(i) of the i-th row is obtained by accumulating the magnitudes of the Y-direction gradient of row i over all w columns (the formula itself appears only as an image in the original publication).
the gradient projection calculation method for the image to be corrected is the same, and is not described herein again.
It should be noted that, in the present invention, the cylinder is developed in the Y direction by default, and if the cylinder is developed in the X direction, the developed image needs to be rotated by 90 degrees, so as to ensure that there is no difference between the reference image and the image content to be corrected in the X direction, and there is a difference in expansion and contraction only in the Y direction.
And the salient line selection unit 203 is used for performing salient line selection on the reference image and the image to be corrected according to the gradient projection value.
Matching every row against every row is not only computationally expensive, but also produces numerous mismatches because a large number of rows are not salient. The present invention therefore screens out the relatively salient rows. In the embodiment of the present invention, a threshold h_t is set manually, and every row whose gradient projection value is larger than h_t is taken as a salient row.
Performing salient-row selection on the reference image and on the image to be corrected yields two sets of salient row indices, denoted S_R and S_W, with m and n elements respectively. The ordinate of the i-th salient line of the reference image is denoted y_R(S_R(i)), and the ordinate of the j-th salient line of the image to be corrected is denoted y_W(S_W(j)). The salient rows are stored sorted by ordinate in ascending order.
And a rough matching unit 204, configured to find, for each significant line of the reference image, a candidate matching line corresponding to the significant line on the image to be corrected.
Specifically, the rough matching unit 204 matches each salient line of the reference image against every salient line of the image to be corrected and finds, for each reference salient line, the p salient lines of the image to be corrected with the smallest match response as candidate matching lines. The match response M(i, k) between the i-th salient line S_R(i) of the reference image and the k-th salient line S_W(k) of the image to be corrected is computed from their gray values in the same way as in the method described above (the formula appears only as an image in the original publication).
For each salient line of the reference image, the p salient lines with the smallest match response are therefore selected as candidate matching lines; the parameter p is set by the user and is usually 2-4. The p match responses are normalized (each divided by their sum) so that they add up to 1, and each normalized value is subtracted from 1 to obtain the matching probability of the corresponding candidate matching line; subtracting from 1 ensures that a smaller match response gives a higher probability.
The sampling correction optimizing unit 205 is configured to sample the matching line pairs obtained by the rough matching unit 204 to obtain q matching line pairs, correct the ordinate of each salient line on the image to be corrected with the salient lines of the reference image in each matching line pair as the reference, search whether a salient line exists in the corresponding range on the reference image according to the correction result, update the confidence of the current group of sampled matching line pairs according to the search result, and determine the optimal sampled matching line pairs through multiple iterations according to the confidence.
After rough matching by the rough matching unit 204, each salient line of the reference image forms matching line pairs with its candidate matching lines in the image to be corrected. The pairs obtained by rough matching contain mismatches of varying degrees, but also a large number of correct pairs, so the reliability of each matching line pair needs to be determined further. In the embodiment of the invention, the matching line pairs are evaluated by random sampling, and sampling is performed multiple times until a suitable group of matching line pairs is obtained. The sampling correction optimizing unit 205 includes:
and the reference image salient line extraction module is used for randomly extracting q different salient lines from the salient lines in the reference image and sorting the salient lines from small to large according to the ordinate. The parameter q is set by the user, and its value is related to the actually extracted saliency line and the imaging quality. When the number of the remarkable rows is large and the imaging quality is good, the number of the remarkable rows can be set to be larger, and the number of the remarkable rows needs to be smaller, and the value range is usually 5-30.
A candidate matching line extraction module, configured to extract, for each extracted significant line, a matching line from p candidate matching lines thereof according to probability, where the candidate matching line extraction module is specifically operative to:
step 1: randomly generating 1 integer with the value range of 1,2, …, p, wherein the value corresponds to the selected candidate matching row;
step2, randomly generating 1 value with the value range of [0,1], and when the value is less than the matching probability of the candidate matching row, the current candidate matching row is a sampled matching row; otherwise, step1 is re-executed until a matching line is selected that meets the condition.
And the sequence consistency verification module is used for performing sequence consistency verification on the extraction result of the candidate matching line extraction module, specifically, for q significant lines in the reference image, if the ordinate of the ith significant line is smaller than the ordinate of the jth significant line (i, j belongs to {1,2, …, q } and i ≠ j), the matching line extracted from the image to be corrected must also satisfy the condition, otherwise, the current sampling is discarded and returned to the reference image significant line extraction module.
After the sampling matching pair operation is performed, q matching row pairs are obtained.
And the salient line correction module is used for correcting the ordinate of each salient line on the image to be corrected by taking the sampling line of the reference image as a reference according to the q sampled matching line pairs, searching whether the salient line exists in the corresponding range on the reference image according to the correction result, and updating the confidence coefficient of the sampling matching pair in the current group according to the search result.
Specifically, after each group of sampled matching pairs is obtained, its confidence is initialized to 0. Denote the ordinates of the q sampled salient lines on the reference image, sorted in ascending order, by y_R^s(1), y_R^s(2), …, y_R^s(q), and the ordinates of their matching lines on the image to be corrected W by y_W^s(1), y_W^s(2), …, y_W^s(q).
The ordinate of every salient line on the image to be corrected is corrected with the sampled lines (salient lines) of the reference image as the reference. Taking the j-th salient line S_W(j) as an example, its original ordinate is y_W(S_W(j)) and its corrected ordinate ŷ_W(S_W(j)) is obtained piecewise from the neighbouring sampled pairs (the original formulas are given as images and follow the same adjacent-pair proportional scaling used by the geometric correction unit 207):
a. when y_W(S_W(j)) < y_W^s(1), the sampled pairs 1 and 2 are used;
b. when y_W^s(k) ≤ y_W(S_W(j)) < y_W^s(k+1), with k = 1, 2, …, q-1, the sampled pairs k and k+1 are used;
c. when y_W(S_W(j)) ≥ y_W^s(q), the sampled pairs q-1 and q are used.
After the corrected ordinate ŷ_W(S_W(j)) is obtained, the corresponding range [ŷ_W(S_W(j)) - r_t, ŷ_W(S_W(j)) + r_t] on the reference image is searched; if a salient line of the reference image is found inside it, the current confidence is increased by 1. Here r_t is a user-set parameter, typically 2-5: the smaller r_t, the stricter the decision; the larger r_t, the more relaxed the decision.
And the iteration execution module, which returns to the reference image salient line extraction module and iterates up to a set number of iterations (for example, 100), saving the group of matching line pairs with the highest confidence as the optimal sampled matching line pairs; that is, the best group is selected from all sampled groups, each group consisting of q matching line pairs.
And the fine matching unit 206 is configured to allocate a matching line on the image to be corrected for each significant line on the reference image with reference to the optimal sampling matching line pair.
It is assumed that the sampling correction optimizing unit obtains, by iterative optimization, the optimal group of sampled matching line pairs, whose ordinates on the reference image, sorted in ascending order, are denoted y_R^o(1), y_R^o(2), …, y_R^o(q), and whose corresponding best matching lines on the image to be corrected W have ordinates y_W^o(1), y_W^o(2), …, y_W^o(q).
The position of each salient line of the reference image is transformed to a corresponding position on the image to be corrected, and matching is performed again near that position to find the best corresponding line. Taking the i-th salient line S_R(i) of the reference image as an example, its original ordinate is y_R(S_R(i)); its transformed ordinate on the image to be corrected is denoted y_R^W(S_R(i)) and is computed piecewise from the neighbouring optimal pairs (the original formulas are given as images and follow the same adjacent-pair proportional scaling used by the geometric correction unit 207):
a. when y_R(S_R(i)) < y_R^o(1), the optimal pairs 1 and 2 are used;
b. when y_R^o(k) ≤ y_R(S_R(i)) < y_R^o(k+1), with k = 1, 2, …, q-1, the optimal pairs k and k+1 are used;
c. when y_R(S_R(i)) ≥ y_R^o(q), the optimal pairs q-1 and q are used.
After the corresponding position y_R^W(S_R(i)) on the image to be corrected is obtained for the salient line S_R(i), matching is performed again within a small range of rows around y_R^W(S_R(i)) (the range is given as an image in the original publication): for each integer row index k in that range, the match response between S_R(i) and row k of the image to be corrected is computed as in the rough matching unit, and the row with the minimum response value is taken as the line of the image to be corrected corresponding to the salient line S_R(i) of the reference image. For convenience of description it is denoted S'_W(i), with ordinate y_W(S'_W(i)).
And a geometric correction unit 207, configured to transform the image to be corrected by using the obtained matching line to obtain an image aligned with the reference image, so as to implement geometric correction.
In the present invention, the row position u' on the image to be corrected W that corresponds to the u-th row of the aligned image C is obtained as follows:
a. When u < y_R(S_R(1)), the proportional relationship between the ordinates of the 1st and 2nd salient lines is used to estimate the corresponding row u' on the image to be corrected W:
(u' - y_W(S'_W(1))) / (u - y_R(S_R(1))) = (y_W(S'_W(1)) - y_W(S'_W(2))) / (y_R(S_R(1)) - y_R(S_R(2)))
Here u' - y_W(S'_W(1)) is the distance from row u' to row y_W(S'_W(1)) on the image W; u - y_R(S_R(1)) is the distance from row u to row y_R(S_R(1)) on the image R; y_W(S'_W(1)) - y_W(S'_W(2)) is the distance from row y_W(S'_W(1)) to row y_W(S'_W(2)) on the image W; and y_R(S_R(1)) - y_R(S_R(2)) is the distance from row y_R(S_R(1)) to row y_R(S_R(2)) on the image R. The invention obtains the above equation by scaling the distances between adjacent matched rows.
b. When y_R(S_R(i)) ≤ u < y_R(S_R(i+1)), with i = 1, 2, …, q-1, the proportional relationship between the ordinates of the i-th and (i+1)-th salient lines is used to estimate the corresponding row u' on the image to be corrected W:
(u' - y_W(S'_W(i))) / (u - y_R(S_R(i))) = (y_W(S'_W(i)) - y_W(S'_W(i+1))) / (y_R(S_R(i)) - y_R(S_R(i+1)))
with the terms defined analogously to case a.
c. When u ≥ y_R(S_R(q)), the proportional relationship between the ordinates of the (q-1)-th and q-th salient lines is used to estimate the corresponding row u':
(u' - y_W(S'_W(q))) / (u - y_R(S_R(q))) = (y_W(S'_W(q)) - y_W(S'_W(q-1))) / (y_R(S_R(q)) - y_R(S_R(q-1)))
again with the terms defined analogously to case a.
Once this correspondence is obtained, u' is generally not an integer, so the gray value of every point of the u-th row of the aligned image C is obtained by linear interpolation between the two rows of W adjacent to u':
C(u, v) = (⌈u'⌉ - u') · W(⌊u'⌋, v) + (u' - ⌊u'⌋) · W(⌈u'⌉, v)
where ⌊u'⌋ is u' rounded down to the nearest row index, ⌈u'⌉ = ⌊u'⌋ + 1, and v = 1, 2, …, w is the abscissa.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Modifications and variations can be made to the above-described embodiments by those skilled in the art without departing from the spirit and scope of the present invention. Therefore, the scope of the invention should be determined from the following claims.

Claims (10)

1. A line-scan cylinder geometric correction method based on salient line matching comprises the following steps:
step S1, acquiring an unfolded image of a perfect and undamaged cylindrical workpiece by using a line-scan camera, preprocessing the image to obtain an ideal image as a reference image, and performing line-scan acquisition on the unfolded image of the cylindrical workpiece to be corrected by using the line-scan camera as an image to be corrected;
step S2, respectively carrying out gradient projection on the reference image and the image to be corrected;
step S3, significant line selection is carried out on the reference image and the image to be corrected according to the gradient projection value;
step S4, for each significant line of the reference image, finding a candidate matching line corresponding to the significant line on the image to be corrected;
step S5, sampling the matching line pairs obtained in step S4 to obtain q matching line pairs, correcting the vertical coordinate of each significant line on the image to be corrected with the significant lines of the reference image in each matching line pair as the reference, searching whether a significant line exists in the corresponding range on the reference image according to the correction result, updating the confidence of the current group of sampled matching line pairs according to the search result, and determining the optimal sampled matching line pairs through multiple iterations according to the confidence;
step S6, with the optimal sampling matching line pair as reference, allocating a matching line on the image to be corrected for each salient line on the reference image;
and step S7, transforming the image to be corrected by using the obtained matching lines to obtain an image aligned with the reference image, thereby realizing geometric correction.
2. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 1, characterized by: in step S1, if the side of the cylinder changes along the Y direction of the image, the width and height of the reference image are the same and different from those of the image to be corrected; if the side surface of the cylinder changes along the X direction of the image, the height of the reference image is the same as that of the image to be corrected, and the width of the reference image is different from that of the image to be corrected.
3. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 2, characterized by: in step S1, the cylinder is expanded in the Y direction by default, and if the cylinder is expanded in the X direction, the expanded image is rotated by 90 degrees.
4. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 1, characterized by: in step S3, the line with the gradient projection value greater than the predetermined threshold is regarded as the significant line.
5. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 1, characterized by: in step S4, the match response M(i, k) between the i-th significant line S_R(i) of the reference image and the k-th significant line S_W(k) of the image to be corrected is calculated from R(S_R(i), j), the gray value of the j-th point on the i-th significant line S_R(i) of the reference image R, and W(S_W(k), j), the gray value of the j-th point on the k-th significant line S_W(k) of the image to be corrected W (the formula appears only as an image in the original publication); and, according to the match response values between the significant lines of the reference image and the significant lines of the image to be corrected, the p significant lines with the smallest match response are selected as candidate matching lines.
6. The method for geometric correction of line-scan cylinder based on saliency row matching as claimed in claim 1, wherein step S5 further comprises the steps of:
step S500, randomly extracting q different salient lines from the salient lines in the reference image, and sorting the salient lines from small to large according to the vertical coordinate;
step S501, for each extracted significant line, extracting matching lines randomly from p candidate matching lines according to probability;
step S502, for q significant rows in the reference image, if the ordinate of the ith significant row is smaller than the ordinate of the jth significant row (i, j ∈ {1,2, …, q } and i ≠ j), then the matching row extracted from the image to be corrected must also satisfy the condition, otherwise, the current sampling is discarded and the step S500 is executed again;
step S503, according to q sampled matching line pairs, taking the sampling line of the reference image as a reference, correcting the ordinate of each significant line on the image to be corrected, searching whether significant lines exist in the corresponding range on the reference image according to the correction result, and updating the confidence coefficient of the sampling matching pair in the current group according to the search result;
and step S504, returning to the step S500, iterating to the set iteration times, and saving the matching line pair with the maximum confidence coefficient as the optimal sampling matching line pair.
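A minimal Python sketch of the sampling loop of claim 6 (steps S500–S504). The candidate-drawing and confidence routines are passed in as callables and are sketched after claims 7 and 8 below; all names and calling conventions are assumptions.

```python
import random

def sample_best_pairs(ref_salient, candidates, q, iterations, draw, score):
    """Iterative sampling of q matching row pairs (claim 6, steps S500-S504).

    `candidates[r]` holds the candidate list for reference row r, in whatever form
    the `draw` callable expects; `score` returns the confidence of one sampled group.
    """
    best_pairs, best_conf = None, -1
    rows = list(ref_salient)
    for _ in range(iterations):                                    # S504: fixed iteration count
        sample_refs = sorted(random.sample(rows, q))               # S500: q rows, ascending ordinate
        sample_warps = [draw(candidates[r]) for r in sample_refs]  # S501: one candidate per row
        if any(sample_warps[a] >= sample_warps[a + 1] for a in range(q - 1)):
            continue                                               # S502: discard out-of-order samples
        conf = score(sample_refs, sample_warps)                    # S503: confidence of this group
        if conf > best_conf:
            best_conf, best_pairs = conf, list(zip(sample_refs, sample_warps))
    return best_pairs, best_conf                                   # best group over all iterations
```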
7. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 6, wherein in step S501 the following operations are performed:
step 1, randomly generating an integer between 1 and p, the integer indexing the selected candidate matching line;
step 2, randomly generating a value in the range [0, 1]; when the value is less than the probability of the selected candidate matching line, the current candidate matching line is taken as the sampled matching line; otherwise, step 1 is re-executed until a matching line meeting the condition is selected.
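A minimal Python sketch of the probability-based draw of claim 7; how each candidate's probability is derived from its matching response is not fixed in the claim and is left to the caller as an assumption.

```python
import random

def draw_candidate(candidates):
    """Probability-weighted draw of one matching row (claim 7).

    `candidates` is a list of (row, probability) pairs; probabilities are assumed
    to lie in (0, 1] so the rejection loop terminates.
    """
    while True:
        idx = random.randrange(len(candidates))   # step 1: pick one of the p candidates
        row, prob = candidates[idx]
        if random.random() < prob:                # step 2: accept with its probability
            return row
```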
8. The line-scan cylinder geometry correction method based on saliency row matching as claimed in claim 7, wherein step S503 further comprises:
after each group of sampled matching pairs is obtained, initializing the confidence of the group to 0;
correcting the ordinate of each salient line on the image to be corrected, taking the sampled lines of the reference image as the reference;
and, after the corrected ordinate is obtained, searching whether a salient line exists within the corresponding range on the reference image, and if so, increasing the current confidence by 1.
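A minimal Python/NumPy sketch of the confidence update of claim 8, assuming the ordinate correction is a piecewise-linear mapping through the sampled pairs and that the "corresponding range" is a small ± tolerance; both choices are assumptions.

```python
import numpy as np

def pair_confidence(sample_refs, sample_warps, ref_salient, warp_salient, tol=2):
    """Confidence of one group of sampled pairs (claim 8).

    Each salient row of the image to be corrected is mapped into reference
    coordinates through the sampled pairs, and the confidence counts how many
    mapped rows land within +/- tol of a salient row of the reference image.
    """
    sample_warps = np.asarray(sample_warps, dtype=np.float64)
    sample_refs = np.asarray(sample_refs, dtype=np.float64)
    ref_salient = np.asarray(sorted(ref_salient), dtype=np.float64)

    confidence = 0
    for y in warp_salient:
        corrected = np.interp(y, sample_warps, sample_refs)   # corrected ordinate
        if np.any(np.abs(ref_salient - corrected) <= tol):    # salient row found nearby?
            confidence += 1
    return confidence
```

In the sampling loop sketched after claim 6, this routine could be supplied as, for example, score=lambda r, w: pair_confidence(r, w, ref_salient, warp_salient).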
9. The method as claimed in claim 8, wherein in step S6, the position of each salient line of the reference image is transformed to obtain the corresponding position in the image to be corrected, and re-matching is performed on the image to be corrected near that position to find the best corresponding line.
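A minimal Python/NumPy sketch of the fine matching of claim 9, assuming the optimal sampled pairs define the transform of salient-line positions and that re-matching searches a small window around the predicted row with the same sum-of-absolute-differences response used in the claim-5 sketch; the window size is an assumption.

```python
import numpy as np

def assign_matches(ref_img, warp_img, ref_salient, best_pairs, window=5):
    """Fine matching (step S6): predict each reference salient row's position in the
    image to be corrected from the optimal sampled pairs, then re-match within a
    small window to find the best corresponding row."""
    refs = np.array([r for r, _ in best_pairs], dtype=np.float64)
    warps = np.array([w for _, w in best_pairs], dtype=np.float64)

    matches = {}
    for r in ref_salient:
        predicted = int(round(np.interp(r, refs, warps)))   # predicted row in the warped image
        lo = max(predicted - window, 0)
        hi = min(predicted + window + 1, warp_img.shape[0])
        ref_line = ref_img[r].astype(np.float64)
        responses = [np.abs(ref_line - warp_img[k].astype(np.float64)).sum()
                     for k in range(lo, hi)]
        matches[r] = lo + int(np.argmin(responses))         # best corresponding row
    return matches
```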
10. A line-scan cylinder geometry correction apparatus based on saliency row matching, comprising:
the image acquisition and processing unit is used for acquiring an unfolded image of a perfect and undamaged cylindrical workpiece by using the line-scan camera, preprocessing the image to obtain an ideal image as a reference image, and performing line-scan acquisition and unfolding on the cylindrical workpiece to be corrected by using the line-scan camera to obtain the unfolded image as an image to be corrected;
the gradient projection unit is used for respectively carrying out gradient projection on the reference image and the image to be corrected;
the salient line selection unit is used for carrying out salient line selection on the reference image and the image to be corrected according to the gradient projection value;
the rough matching unit is used for searching a candidate matching line corresponding to each salient line of the reference image on the image to be corrected;
the sampling correction optimizing unit is used for sampling the matching line pairs obtained by the coarse matching unit to obtain q matching line pairs, taking the reference-image salient line of each matching line pair as the reference to correct the ordinate of each salient line on the image to be corrected, searching whether a salient line exists within the corresponding range on the reference image according to the correction result, updating the confidence of the current group of sampled matching line pairs according to the search result, and determining the optimal sampled matching line pairs from the confidence over multiple iterations;
the fine matching unit is used for taking the optimal sampled matching line pairs as the reference and allocating a matching line on the image to be corrected to each salient line on the reference image;
and the geometric correction unit is used for transforming the image to be corrected by using the obtained matching lines to obtain an image aligned with the reference image, so that geometric correction is realized.
CN202110965505.1A 2021-08-23 2021-08-23 Line scanning cylinder geometric correction method and device based on significant line matching Active CN113674174B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110965505.1A CN113674174B (en) 2021-08-23 2021-08-23 Line scanning cylinder geometric correction method and device based on significant line matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110965505.1A CN113674174B (en) 2021-08-23 2021-08-23 Line scanning cylinder geometric correction method and device based on significant line matching

Publications (2)

Publication Number Publication Date
CN113674174A true CN113674174A (en) 2021-11-19
CN113674174B CN113674174B (en) 2023-10-20

Family

ID=78544877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110965505.1A Active CN113674174B (en) 2021-08-23 2021-08-23 Line scanning cylinder geometric correction method and device based on significant line matching

Country Status (1)

Country Link
CN (1) CN113674174B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102236798A (en) * 2011-08-01 2011-11-09 清华大学 Image matching method and device
CN103136760A (en) * 2013-03-26 2013-06-05 华北电力大学(保定) Multi sensor image matching method based on fast and daisy
CN103456022A (en) * 2013-09-24 2013-12-18 中国科学院自动化研究所 High-resolution remote sensing image feature matching method
CN103489191A (en) * 2013-09-24 2014-01-01 中国科学院自动化研究所 Method for detecting changes of remarkable target of remote sensing image
CN104077782A (en) * 2014-07-11 2014-10-01 中国科学院自动化研究所 Satellite-borne remote sense image matching method
CN107918927A (en) * 2017-11-30 2018-04-17 武汉理工大学 A kind of matching strategy fusion and the fast image splicing method of low error

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨明东: "Low-error fast image stitching algorithm based on matching strategy fusion", Application Research of Computers (计算机应用研究), vol. 36, no. 4, pages 1222-1227 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114782276A (en) * 2022-04-29 2022-07-22 电子科技大学 Resistivity imaging dislocation correction method based on adaptive gradient projection
CN114782276B (en) * 2022-04-29 2023-04-11 电子科技大学 Resistivity imaging dislocation correction method based on adaptive gradient projection

Also Published As

Publication number Publication date
CN113674174B (en) 2023-10-20

Similar Documents

Publication Publication Date Title
CN110942458B (en) Temperature anomaly defect detection and positioning method and system
JP4125393B2 (en) Image fragment splicing device
CN111429494B (en) Biological vision-based point cloud high-precision automatic registration method
CN107240130B (en) Remote sensing image registration method, device and system
CN108009986B (en) Fragment splicing method and device based on edge information
CN107862319B (en) Heterogeneous high-light optical image matching error eliminating method based on neighborhood voting
WO2014203687A1 (en) Image processing method, image processing device, and image processing program
CN108763575B (en) Image control point automatic selection method based on image control point database
WO2015035462A1 (en) Point feature based 2d-3d registration
TWI291543B (en) A method and a system for creating a reference image using unknown quality patterns
CN114897705A (en) Unmanned aerial vehicle remote sensing image splicing method based on feature optimization
US20150186422A1 (en) Image processing apparatus, image processing method, and image processing program
CN107845066B (en) Urban remote sensing image splicing method and device based on piecewise affine transformation model
CN107945120B (en) Sample block based rotation and scaling image restoration method
CN113674174A (en) Line scanning cylinder geometric correction method and device based on significant row matching
US7113652B2 (en) System and method for using normalized gray scale pattern find
CN113658080B (en) Linear scanning cylinder geometric correction method and device based on characteristic point matching
CN117635421A (en) Image stitching and fusion method and device
JPH0935062A (en) Image processing method
JP6919758B2 (en) Image collation device
US8655080B2 (en) Method and apparatus for identifying combinations of matching regions in images
CN111402281B (en) Book edge detection method and device
US7085433B2 (en) System and method for performing rotational and translational testing for a reference image used in a normalized gray scale pattern find system
Balletti et al. Image matching for historical maps comparison
Wang et al. Automatic corresponding control points selection for historical document image registration

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant