CN115345780A - High-precision image splicing method and system based on grating ruler and storage medium - Google Patents

High-precision image splicing method and system based on grating ruler and storage medium

Info

Publication number
CN115345780A
Authority
CN
China
Prior art keywords
image
calculating
picture
splicing
points
Prior art date
Legal status
Pending
Application number
CN202210956388.7A
Other languages
Chinese (zh)
Inventor
余林彬
邵云峰
沈曦
曹桂平
董宁
Current Assignee
Hefei Eko Photoelectric Technology Co ltd
Original Assignee
Hefei Eko Photoelectric Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Eko Photoelectric Technology Co ltd filed Critical Hefei Eko Photoelectric Technology Co ltd
Priority to CN202210956388.7A
Publication of CN115345780A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a high-precision image splicing method, system and storage medium based on a grating ruler. The method comprises the following steps: calibrating the platform attributes according to matched feature points; calculating a brightness compensation coefficient according to the matched feature points; shooting and recording the grating-ruler readings; calculating the actual physical coordinates of each small picture and of the formed picture; calculating a rotation matrix for splicing according to the feature-point matching relationship among the small pictures and transforming each small picture accordingly; finding, for each pixel point on the formed picture, the corresponding pixel point on a small picture; performing weighted fusion on the points around the splicing seams to optimize the seams; and calculating the gray values of the remaining points by bilinear interpolation to complete the splicing. By introducing a feature-point matching algorithm and replacing the circle centres of a circular calibration plate with matched feature points, the invention removes the dependence of the original technique on the circular calibration plate and reduces the factors that may affect the platform-attribute calibration result; a splicing optimization algorithm further improves the spliced result picture.

Description

High-precision image splicing method and system based on grating ruler and storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to a high-precision image splicing method and system based on a grating ruler.
Background
Image stitching is a relatively mature application of image processing: the same object is shot several times to acquire pictures of all of its parts, and a complete picture of the object is obtained by stitching them together. This effectively solves the problem of the camera field of view being smaller than the object to be shot.
The traditional image splicing method first extracts feature points from each shot image; feature points are pixels whose gray values change sharply relative to the surrounding pixels and which describe the image features well. The extracted feature points are then matched by an algorithm, common ones being Harris, SIFT and SURF. A projection matrix of one picture relative to the other is then calculated from the matching result, the picture is transformed according to the projection matrix and fused with the other picture to obtain the spliced image.
Traditional image stitching meets the needs of most scenes requiring image stitching, but the projective transformation of the pictures cannot guarantee the accuracy of the formed picture; for pictures of higher resolution, errors of tens or even hundreds of pixels may result. For this reason, a sub-pixel stitching technique based on a grating ruler was proposed in "Sub-pixel stitching in automatic visual inspection of large object surfaces with repeated features" (Optics, No. 3, 2014). The specific method is as follows:
First, a circular calibration plate is used to calibrate the camera magnification and the included angles between the x- and y-axes of the grating-ruler moving platform and the x- and y-axes of the camera; these parameters serve as the platform parameters of the stitching platform. Pictures are then shot while recording the grating-ruler reading at each shot; the physical coordinates of each picture are calculated from the readings and the platform parameters, and the physical coordinates of the spliced picture are calculated from the picture physical coordinates. The following operations are then performed for each pixel point on the spliced picture:
(1) calculate its physical coordinates; (2) determine, from the physical coordinates of the pixel point and of the formed picture, which small picture the pixel point lies on; (3) compare the physical coordinates of the small pictures and map the pixel point onto the small picture; (4) calculate the gray value of the pixel point by bilinear interpolation.
The spliced picture is thus obtained. This technique depends heavily on the calibration precision of the platform attributes, which directly determines its splicing effect. It does not consider the rotation that may occur while the shot object moves, so its applicability is poor. It performs no seam optimization, so when pictures are spliced in a scene with uneven illumination the seams are visually obvious. Moreover, when high-resolution images are spliced, pixel misalignment may appear around the seams, giving the finished picture an obvious sense of dislocation. To solve these problems, the invention provides an improved high-precision image splicing technique based on a grating ruler.
Specifically, the prior art has the following defects:
1. the rotation that may occur while the shot object moves is not considered, so the applicability is not high;
2. the method of calibrating the platform attributes with a circular calibration plate involves a complex operation process, demands high operation precision, and depends heavily on the precision of the circular calibration plate; if the plate precision is low, the calibrated platform attributes are inaccurate;
3. seam optimization is not considered, so the visual effect of the resulting picture is poor.
Disclosure of Invention
The invention provides a high-precision image splicing method based on a grating ruler, which can at least solve one of the technical problems.
In order to achieve the purpose, the invention adopts the following technical scheme:
a high-precision image splicing method based on a grating ruler comprises the following steps:
carrying out attribute calibration on the mobile platform with the grating ruler by using a method of matching characteristic points;
acquiring multiple frames of images to be spliced and grating ruler readings corresponding to the images;
calculating the actual physical coordinates of each small picture and the actual physical coordinates of the formed picture;
corresponding pixel points on the formed image to pixel points on each small image one by one;
and calculating the gray value of each pixel point on the image through a bilinear interpolation method to finish splicing.
Further, after the gray value of each pixel point on the formed picture is calculated by bilinear interpolation, the seam can be optimized by a seam optimization algorithm.
Further, optimizing the seam using a seam optimization algorithm includes:
finding a pair of corresponding pixel points on the small images on the two sides of the seam;
calculating the gray value of the pair of pixel points by a bilinear interpolation method;
and performing weighted fusion according to the proportion of the physical distances between the pair of pixel points and the seam, the weights being set so that the point closer to the seam receives the higher weight.
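The weighted fusion above can be sketched as follows; the function name and the exact distance-based weighting form are illustrative assumptions, since the method only states that the point closer to the seam receives the higher weight:

```python
def blend_across_seam(g_left, g_right, d_left, d_right):
    """Weighted fusion of a pair of corresponding gray values near a seam.

    g_left / g_right: gray values interpolated from the small pictures on
    either side of the seam; d_left / d_right: physical distances of the
    point from the seam in each small picture. The closer point receives
    the higher weight, as described in the method.
    """
    total = d_left + d_right
    if total == 0:  # point lies exactly on the seam
        return (g_left + g_right) / 2
    # swapping the distance ratios makes the nearer point dominate
    w_left = d_right / total
    w_right = d_left / total
    return w_left * g_left + w_right * g_right
```

With distances 1 and 3 from the seam, the nearer gray value dominates the fused result.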
Further, the method also comprises the following steps of rotating the small graph:
calculating a rotation matrix when all the small images are spliced according to the feature point matching relation among the small images;
and transforming each small graph according to the rotation matrix.
further, the attribute calibration of the mobile platform with the grating ruler by using the method of matching feature points comprises the following steps:
the calibrated platform attributes include the magnification p in the x-axis direction of the camera x Magnification p of the y-axis of the camera y The angle theta between the x-axis of the platform and the x-axis of the camera x The included angle theta between the y-axis of the platform and the y-axis of the camera y
the camera shoots picture one; the shot object is moved a certain distance along the platform x-axis and y-axis respectively, picture two is shot, and the moving distance is recorded;
selecting and matching feature points of the first picture and the second picture;
and calculating the platform attribute according to the matched feature points and the image moving distance.
Further, the selecting and matching of the feature points of the first picture and the second picture further comprises:
performing feature point matching by using one of Harris, SIFT and SURF algorithms;
respectively using the first picture and the second picture as reference pictures to carry out feature point matching twice;
combining the results of the two matching and screening by using an RANSAC algorithm;
and screening out the unnecessary characteristic points, and using the remaining characteristic point pairs as matched characteristic point pairs.
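The RANSAC screening step can be sketched as follows; the pure-translation motion model and the function name are illustrative assumptions, since the method only states that RANSAC screens out the unnecessary matched pairs:

```python
import numpy as np

def ransac_translation(pts1, pts2, n_iter=200, tol=2.0, seed=0):
    """Screen matched feature-point pairs with a translation-model RANSAC.

    pts1, pts2: (N, 2) arrays of matched pixel coordinates from picture
    one and picture two. Returns a boolean inlier mask; pairs whose
    residual under the best translation exceeds tol are screened out.
    """
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(pts1), dtype=bool)
    for _ in range(n_iter):
        i = rng.integers(len(pts1))      # one pair defines a candidate translation
        t = pts2[i] - pts1[i]
        err = np.linalg.norm(pts1 + t - pts2, axis=1)
        mask = err < tol
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask
```

The surviving pairs then serve as the matched feature points used for calibration.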
Further, after the gray value of each pixel point on the graph is calculated by the bilinear interpolation method, the method further comprises the following steps:
calculating a brightness compensation coefficient according to the matching feature points;
and performing brightness compensation on the small graph needing brightness compensation according to the compensation coefficient.
Further, the calculating the brightness compensation coefficient according to the matched feature point includes:
selecting a range which does not contain other feature points around any pair of matched feature points in the first picture and the second picture;
calculating the mean value of the gray values of the pixel points in the range on the two pictures;
dividing the mean gray value on picture one by the mean gray value on picture two, the result being used as the brightness compensation coefficient of the feature point;
performing the operation on all the matched feature points on the two pictures to obtain the brightness compensation coefficients of all the feature points;
and taking the average value of the brightness compensation coefficients of all the characteristic points as the brightness compensation coefficient of the picture needing brightness compensation.
Further, the actual physical coordinates of each small graph are calculated by using the following formula:
(x_A0, y_A0), (x_A0 + W·p_x, y_A0), (x_A0 + W·p_x, y_A0 + H·p_y), (x_A0, y_A0 + H·p_y)   (1)
In formula (1), W and H are the resolution of the picture, p_x is the magnification in the camera x-axis direction, p_y is the magnification in the camera y-axis direction, and (x_A0, y_A0) is the physical coordinate of the lower-left vertex of the small graph, with (x_A0, y_A0) = (x_0·cosθ_x + y_0·sinθ_y, x_0·sinθ_x + y_0·cosθ_y), where (x_0, y_0) is the recorded grating-ruler reading, θ_x is the angle between the platform x-axis and the camera x-axis, and θ_y is the angle between the platform y-axis and the camera y-axis.
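Formula (1) can be evaluated as follows (angles in radians; the function name is illustrative):

```python
import math

def small_image_corners(x0, y0, W, H, px, py, theta_x, theta_y):
    """Physical corner coordinates of one small picture, per formula (1).

    (x0, y0): recorded grating-ruler reading; W, H: picture resolution;
    px, py: camera magnifications; theta_x, theta_y: included angles
    between the platform axes and the camera axes, in radians.
    """
    # lower-left vertex (x_A0, y_A0)
    xa = x0 * math.cos(theta_x) + y0 * math.sin(theta_y)
    ya = x0 * math.sin(theta_x) + y0 * math.cos(theta_y)
    return [(xa, ya),
            (xa + W * px, ya),
            (xa + W * px, ya + H * py),
            (xa, ya + H * py)]
```

With zero included angles, the corners reduce to an axis-aligned rectangle anchored at the reading.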
Further, the one-to-one correspondence between the pixel points on the map and the pixel points on each small map includes:
corresponding the pixel coordinates of the pixel points on the image to the actual physical coordinates;
and obtaining the pixel coordinate of the pixel point on the corresponding small image according to the physical position of the pixel point on the image.
On the other hand, the invention also discloses a high-precision image splicing system based on the grating ruler, which is characterized by comprising the following modules:
the attribute calibration module is used for performing attribute calibration on the mobile platform with the grating ruler and storing corresponding platform attributes;
and the image splicing module is used for calculating the actual physical coordinates of each small image, calculating the actual physical coordinates of the image, corresponding the pixel points on the image to the pixel points on each small image one by one, calculating the gray value of each pixel point on the image by a bilinear interpolation method, and completing splicing.
Further, the high-precision image stitching system based on the grating ruler further comprises:
the brightness compensation module is used for calculating a brightness compensation coefficient according to the matching feature points and performing brightness compensation on the small graph needing the brightness compensation according to the compensation coefficient;
the seam optimizing module is used for optimizing the seam by using a seam optimizing algorithm;
and the rotation correction module is used for calculating a rotation matrix when all the small images are spliced according to the characteristic point matching relation among the small images and transforming all the small images according to the rotation matrix.
In still another aspect, the present invention further discloses a computer-readable storage medium storing program data that can be executed to implement the above high-precision image splicing method based on a grating ruler.
According to the above technical scheme, the traditional image splicing mode based on feature-point matching, although of good splicing quality and strong universality, cannot guarantee the accuracy of the spliced image and is slow, so it cannot be applied in scenarios where imaging accuracy must be guaranteed. Aiming at this problem, the invention provides a high-precision image splicing method, system and storage medium based on a grating ruler, which preserve the image splicing quality while reaching sub-pixel splicing precision.
The invention solves the problem of the shot object rotating in actual use by adding the calculation of a rotation matrix and the corresponding transformation to the splicing process; by introducing a feature-point matching algorithm and replacing the circle centres of a circular calibration plate with matched feature points, it removes the dependence of the original technique on the circular calibration plate and reduces the factors that may affect the platform-attribute calibration result; and it optimizes the spliced result picture by adding a splicing optimization algorithm. Compared with the prior art, the method has stronger applicability, a simpler operation process, fewer factors influencing the result, and a better visual effect of the spliced result picture.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
As shown in fig. 1, the method for splicing high-precision images based on a grating scale in this embodiment includes:
carrying out attribute calibration on the mobile platform with the grating ruler by using a method of matching feature points;
acquiring a plurality of frames of images to be spliced and the grating-ruler reading corresponding to each frame of image;
calculating the actual physical coordinates of each small picture, and calculating the actual physical coordinates of the pictures;
corresponding the pixel points on the image to the pixel points on each small image one by one;
and calculating the gray value of each pixel point on the image through a bilinear interpolation method to finish splicing.
The method comprises the following specific steps:
An object to be shot is placed on a moving platform with a grating ruler; the camera is placed directly above the platform and kept still during shooting, so only the platform moves. The platform moves in two mutually perpendicular directions, denoted the x-axis and the y-axis. Theoretically these should be parallel to the camera x- and y-axes, but owing to installation errors they are not exactly parallel and form certain included angles with the camera axes; the values of these two angles are calibrated during platform-attribute calibration.
The mobile-platform attributes are calibrated from matched feature points, the SURF algorithm being used to select and match them. Four platform attributes are calibrated: the magnification p_x in the camera x-axis direction, the magnification p_y in the camera y-axis direction, the included angle θ_x between the platform x-axis and the camera x-axis, and the included angle θ_y between the platform y-axis and the camera y-axis. During calibration, a picture is first taken and recorded as picture one, and the current grating-ruler reading is recorded; the platform then moves a certain distance along its x-axis direction and its y-axis direction, another picture is taken and recorded as picture two, and the moving distance is recorded. Feature points are matched between the two pictures: matching is performed once with picture one as the reference picture and once with picture two as the reference picture; the results of the two matchings are combined and then screened with the RANSAC algorithm, which removes the unnecessary matching pairs, the remaining points serving as the matched feature points. This retains as many matching pairs as possible. Finally, the platform attributes are calculated from the matched pairs and the recorded moving distance. The steps are as follows:
Magnification calibration:
Calibration uses the matched feature points. A picture is taken at position 1 (x1, y1), the platform moves along either axis to position 2 (x2, y2) and another picture is taken; the grating-ruler readings at the two positions are recorded and the moving distance D is calculated in micrometres (μm). Taking the x-axis as an example,
p_x = D / d_x
where d_x is the pixel distance along the x-axis between a pair of matched feature points; the y-axis calculation is the same.
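The magnification calibration above can be sketched as follows (function and parameter names are illustrative):

```python
def magnification(x1, y1, x2, y2, d_pixels):
    """Axis magnification from two grating-ruler readings.

    (x1, y1), (x2, y2): readings at the two shooting positions, in um;
    d_pixels: pixel distance along the same axis between a pair of
    matched feature points. Returns micrometres per pixel.
    """
    D = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5  # physical move distance
    return D / d_pixels
```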
And (3) calibrating an included angle:
taking the x axis as an example, two pictures shot only when moving in the x axis direction are selected, an included angle is calculated according to the coordinate vectors of a pair of matched feature points,
θ_x = arctan(Δy / Δx)
where Δx is the difference between the x coordinates of the two feature points and Δy the difference between their y coordinates (pixel coordinates); the y-axis calculation is the same.
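The included-angle calculation can be sketched as (the function name is illustrative):

```python
import math

def axis_angle(dx, dy):
    """Included angle between a platform axis and the camera axis.

    dx, dy: differences of the pixel coordinates of a pair of matched
    feature points between the two pictures shot while moving only
    along that platform axis. Returns the angle in radians.
    """
    return math.atan2(dy, dx)  # equals arctan(dy/dx) for the small angles expected here
```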
Because the platform attributes are inherent to the platform, a single accurate calibration suffices for a given platform.
From the matched feature-point pairs obtained in the last calibration step and the corresponding pictures, the brightness compensation coefficient used during splicing is calculated. A range is selected around each matched feature point such that it contains no other feature point. The mean gray value of the pixels within this range is calculated on picture one and on picture two, and the mean on picture one is divided by the mean on picture two to give the brightness compensation coefficient for that feature point. This operation is performed for every matched pair, and the mean of the resulting coefficients is taken as the brightness compensation coefficient used in the final splicing.
The actual physical coordinates of each small picture are calculated as follows. The camera's viewing range can be regarded as a rectangle; the physical coordinates of the lower-left vertex of this rectangle are calculated from the recorded grating-ruler reading and the platform attributes. With the recorded reading (x_0, y_0), the physical coordinate of the lower-left vertex is (x_0·cosθ_x + y_0·sinθ_y, x_0·sinθ_x + y_0·cosθ_y), hereinafter denoted (x_A0, y_A0) for convenience. With the picture resolution W × H known, the actual physical coordinates of a small picture are (x_A0, y_A0), (x_A0 + W·p_x, y_A0), (x_A0 + W·p_x, y_A0 + H·p_y), (x_A0, y_A0 + H·p_y).
After the actual physical coordinates of each small picture are calculated, the actual physical coordinates of the spliced picture can be calculated from them. Before splicing and fusion, brightness compensation is applied to the small pictures that need it, using the previously calculated coefficient. Then, for any pixel point on the formed picture, its pixel coordinates are converted to actual physical coordinates; from this physical position, the small picture it belongs to is determined, the point is mapped onto that small picture, and the corresponding pixel coordinates on the small picture are obtained. These coordinates are usually at sub-pixel level, so the gray value is calculated by bilinear interpolation, giving the gray value of every pixel point on the formed picture.
Weighted fusion is applied to the points around the seam to optimize it, as follows: the corresponding pixel points on the small pictures on the two sides of the seam are found, the gray values of the two points are calculated by bilinear interpolation, and weighted fusion is performed according to the proportion of the physical distances between the two points and the seam, the point closer to the seam receiving the higher weight.
Taking 6 images to be spliced as an example, the actual physical coordinates of the spliced images are calculated according to the coordinates of the small images:
1. and (3) performing conversion of a physical rectangular coordinate system:
(1) Six pictures to be spliced are shot, and the grating-ruler readings at each shot are recorded as (x1, y1), (x2, y2), …, (x6, y6), ordered from bottom to top and from left to right;
(2) For any picture, the reading (x0, y0) at its shot is transformed into the physical rectangular coordinate system, giving (x_0·cosθ_x + y_0·sinθ_y, x_0·sinθ_x + y_0·cosθ_y), hereinafter denoted (x_A0, y_A0) for convenience;
(3) With the horizontal and vertical resolutions W and H of the picture known, and combining the magnification P, the physical width and height of the picture are obtained: W_0 = W × P, H_0 = H × P;
(4) The four-corner coordinates of any picture are:
(x_A0, y_A0), (x_A0 + W·p_x, y_A0), (x_A0 + W·p_x, y_A0 + H·p_y), (x_A0, y_A0 + H·p_y)
2. calculating into a graph:
(1) Calculate the four-corner coordinates of the formed picture, for which four values must be determined:
the minimum x value of the formed picture, x_min: the maximum of x_A over the three pictures taken on the left;
the maximum x value of the formed picture, x_max: the minimum of x_A + W_0 over the three pictures taken on the right;
the minimum y value of the formed picture, y_min: the maximum of y_A over the two pictures taken below;
the maximum y value of the formed picture, y_max: the minimum of y_A + H_0 over the two pictures taken above.
Thus the four corner coordinates of the formed picture are:
(x_min, y_min), (x_max, y_min), (x_max, y_max), (x_min, y_max)
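For the six-picture (two-column, three-row) example, the four-corner calculation of the formed picture can be sketched as follows; the function and parameter names are illustrative:

```python
def mosaic_extent(left_xa, right_xa, bottom_ya, top_ya, W0, H0):
    """Physical extent of the stitched picture for the 6-image example.

    left_xa / right_xa: x_A values of the left-column / right-column
    pictures; bottom_ya / top_ya: y_A values of the bottom-row /
    top-row pictures; W0, H0: physical width and height of one picture.
    Returns the four corner coordinates of the formed picture.
    """
    x_min = max(left_xa)                      # largest left edge of the left column
    x_max = min(x + W0 for x in right_xa)     # smallest right edge of the right column
    y_min = max(bottom_ya)                    # largest bottom edge of the bottom row
    y_max = min(y + H0 for y in top_ya)       # smallest top edge of the top row
    return (x_min, y_min), (x_max, y_min), (x_max, y_max), (x_min, y_max)
```

Using the maxima of the lower/left edges and the minima of the upper/right edges keeps the formed picture inside the region covered by every contributing small picture.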
3. calculating the pixel value of each pixel point on the graph:
The pixel size of the formed picture is w_0 × h_0, with
w_0 = (x_max − x_min) / P,  h_0 = (y_max − y_min) / P.
For any pixel point (i, j) on the formed picture, its physical rectangular coordinates are calculated as
(i·P + x_min, j·P + y_min) and compared with the quadrant thresholds to determine which picture it belongs to. The quadrant thresholds are set as follows:
one quadrant threshold is set in the horizontal direction, separating the two columns of pictures, and two quadrant thresholds are set in the vertical direction, separating the three rows; each threshold may be taken at the midpoint of the overlap region between the two adjacent pictures.
Assuming that the picture to which the pixel belongs is picture 3, the gray value at the mapped sub-pixel position (u, v) in picture 3 is calculated by bilinear interpolation:
f(u, v) = (1−a)·(1−b)·f(x0, y0) + a·(1−b)·f(x0+1, y0) + (1−a)·b·f(x0, y0+1) + a·b·f(x0+1, y0+1)
where x0 = ⌊u⌋, y0 = ⌊v⌋, a = u − x0, b = v − y0.
The pixel value at this point is defined as the pixel value at point (i, j) on the graph.
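The bilinear interpolation step can be sketched as follows (the array is indexed [row, column]; names are illustrative):

```python
import numpy as np

def bilinear(img, u, v):
    """Gray value at sub-pixel position (u, v) of a small picture.

    img: 2-D gray array indexed [row, col] = [y, x]; u, v: the sub-pixel
    x and y coordinates mapped back from the formed picture.
    """
    x0, y0 = int(u), int(v)
    a, b = u - x0, v - y0
    return ((1 - a) * (1 - b) * img[y0, x0]
            + a * (1 - b) * img[y0, x0 + 1]
            + (1 - a) * b * img[y0 + 1, x0]
            + a * b * img[y0 + 1, x0 + 1])
```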
The above takes six pictures to be spliced as an example; when the number of pictures to be spliced differs, the calculation follows the same method.
Ideally, high-precision image stitching is completed by the above process, but in actual use the influence of shaking of the shot object on the splicing result must be considered. Slight jitter may occur while the shot object moves with the platform; it is generally very subtle, but for high-resolution cameras it may introduce pixel-level shifts that affect the mosaic quality. Therefore, before splicing, feature-point matching is performed between the small pictures and a rotation matrix for the splicing of each small picture is calculated from the matching result. The calculated rotation matrix contains no scaling factor and only rotates the picture by an angle, so it does not affect the splicing precision. After each small picture is transformed according to its rotation matrix, the subsequent mapping of the formed-picture pixel points is performed, which effectively counteracts the adverse effect of the shaking of the shot object on the mosaic quality.
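A scale-free rotation matrix can be estimated from the matched feature points; the orthogonal Procrustes (Kabsch) solution sketched here is one standard way to obtain a pure rotation without a scaling factor, and is an illustrative choice rather than necessarily the patent's exact method:

```python
import numpy as np

def rotation_matrix(pts_ref, pts_img):
    """Scale-free 2-D rotation between two sets of matched feature points.

    pts_ref, pts_img: (N, 2) matched coordinates. Returns the 2x2
    rotation R (no scaling) that best maps the centred reference
    points onto the centred image points.
    """
    a = pts_ref - pts_ref.mean(axis=0)   # centre both point sets
    b = pts_img - pts_img.mean(axis=0)
    u, _, vt = np.linalg.svd(b.T @ a)
    d = np.sign(np.linalg.det(u @ vt))   # guard against a reflection solution
    return u @ np.diag([1.0, d]) @ vt
```

Because the singular values are discarded, the result contains only angular rotation, matching the requirement that splicing precision is unaffected.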
In summary, the key point and the point to be protected of the present invention lie in the improved high-precision image stitching process based on the grating ruler, including the processes of calibration of the platform inherent attribute by matching the feature points, calculation of the rotation matrix, rotation transformation of the original image, brightness compensation during stitching and seam optimization.
The method handles use scenes in which the shot object rotates, with a simpler operation process, stronger noise resistance, and a better visual effect of the spliced result picture.
Correspondingly, the embodiment of the invention also discloses a high-precision image splicing system based on the grating ruler, which comprises the following modules:
the attribute calibration module is used for calibrating attributes of the mobile platform with the grating ruler and storing corresponding platform attributes;
and the image splicing module is used for calculating the actual physical coordinates of each small image, calculating the actual physical coordinates of the formed image, corresponding the pixel points on the formed image to the pixel points on each small image one by one, calculating the gray value of each pixel point on the formed image through a bilinear interpolation method, and completing splicing.
Wherein, high accuracy image mosaic system based on grating chi still includes:
the brightness compensation module is used for calculating a brightness compensation coefficient according to the matching feature points and performing brightness compensation on the small graph needing the brightness compensation according to the compensation coefficient;
the seam optimizing module is used for optimizing the seam by using a seam optimizing algorithm;
and the rotation correction module is used for calculating a rotation matrix when all the small images are spliced according to the characteristic point matching relation among the small images and transforming all the small images according to the rotation matrix.
In yet another aspect, the present invention also discloses a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of any of the methods described above.
In yet another aspect, the present invention also discloses a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of any of the methods described above.
In a further embodiment provided by the present application, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of any of the methods of the above embodiments.
It is understood that the system provided by the embodiment of the present invention corresponds to the method provided by the embodiment of the present invention, and the explanation, the example and the beneficial effects of the related contents can refer to the corresponding parts in the method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by a computer program, which may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
All possible combinations of the technical features of the above embodiments may not have been described for the sake of brevity, but any such combination should be considered within the scope of the present disclosure as long as the combined features are not contradictory.
The above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (12)

1. A high-precision image splicing method based on a grating ruler is characterized by comprising the following steps:
carrying out attribute calibration on the mobile platform with the grating ruler by using a method of matching feature points;
acquiring a plurality of frames of images to be spliced and the grating ruler reading corresponding to each frame of image;
calculating the actual physical coordinates of each small picture, and calculating the actual physical coordinates of the formed image;
corresponding pixel points on the formed image to pixel points on each small image one by one;
and calculating the gray value of each pixel point on the image through a bilinear interpolation method to finish splicing.
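The bilinear interpolation step above can be sketched as follows; this is an illustrative implementation, assuming the picture is a 2-D gray-value array indexed as [row, column]:

```python
import numpy as np

def bilinear_gray(img, x, y):
    """Sample the gray value at sub-pixel position (x, y) = (col, row)
    by bilinear interpolation of the four surrounding pixels."""
    img = np.asarray(img, dtype=float)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)   # clamp at the image border
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0              # fractional offsets in [0, 1)
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot
```

Sub-pixel sampling is needed because the physical position of a mosaic pixel generally falls between the integer pixel positions of a small picture.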
2. The grating ruler-based high-precision image splicing method according to claim 1, wherein after the gray value of each pixel point on the formed image is calculated by the bilinear interpolation method, the method further comprises optimizing the seam by using a seam optimization algorithm, specifically comprising the following steps:
finding a pair of corresponding pixel points on the small pictures on the two sides of the seam;
calculating the gray value of the pair of pixel points by a bilinear interpolation method;
and performing weighted fusion according to the physical distance between each point of the pair of pixel points and the seam, wherein the point closer to the seam is given the higher weight.
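An illustrative sketch of this weighted fusion, using inverse-distance weights as one concrete choice consistent with "the point closer to the seam has the higher weight" (the claim itself does not fix the exact weighting function):

```python
def seam_blend(gray_a, gray_b, dist_a, dist_b):
    """Fuse a pair of gray values from the two small pictures on either
    side of the seam; the point with the smaller physical distance to
    the seam receives the larger weight."""
    eps = 1e-9                    # avoid division by zero exactly on the seam
    w_a = 1.0 / (dist_a + eps)
    w_b = 1.0 / (dist_b + eps)
    return (w_a * gray_a + w_b * gray_b) / (w_a + w_b)
```

With equal distances the result is the plain average; as one point approaches the seam its value dominates, so the transition across the seam stays continuous.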
3. The method for splicing high-precision images based on the grating ruler as claimed in claim 1 or 2, wherein after calculating the actual physical coordinates of each small image and calculating the actual physical coordinates of the image, the method further comprises:
calculating a rotation matrix when all the small images are spliced according to the feature point matching relation among the small images;
and transforming each small graph according to the rotation matrix.
4. The method for splicing high-precision images based on a grating scale as claimed in claim 1, wherein the method for performing attribute calibration on the moving platform with the grating scale by using the matching feature points comprises:
the calibrated platform attributes include the magnification p in the x-axis direction of the camera x Magnification p of the y-axis of the camera y The included angle theta between the x-axis of the platform and the x-axis of the camera x The included angle theta between the y axis of the platform and the y axis of the camera y
The camera shoots a picture I, a shot object respectively moves a distance along the x axis and the y axis of the platform and then shoots a picture II, and the moving distance of the shot object is recorded;
selecting and matching feature points of the first picture and the second picture;
and calculating the platform attribute according to the matched feature points and the image moving distance.
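One way the attributes for a single axis could be computed from the matched feature points and the recorded moving distance is sketched below; this is an illustrative assumption, since the claim does not spell out the exact formulas:

```python
import numpy as np

def calibrate_axis(px_shift, physical_dist):
    """Given the mean pixel displacement (du, dv) of the matched feature
    points after a known physical move along one platform axis, return
    the magnification (physical units per pixel) and the angle of that
    platform axis measured against the camera x-axis."""
    du, dv = px_shift
    pixel_dist = np.hypot(du, dv)               # displacement magnitude in pixels
    magnification = physical_dist / pixel_dist  # e.g. mm per pixel
    angle = np.arctan2(dv, du)                  # radians
    return magnification, angle
```

Running this once for the x-axis move and once for the y-axis move yields (p_x, θ_x) and (p_y, θ_y) respectively, under the stated convention that both angles are measured in camera coordinates.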
5. The method for splicing high-precision images based on the grating ruler as claimed in claim 4, wherein the selecting and matching of the feature points of the first picture and the second picture comprises:
performing feature point matching by using one of Harris, SIFT and SURF algorithms;
respectively using the first picture and the second picture as reference pictures to carry out feature point matching twice;
combining the results of the two matching and screening by using an RANSAC algorithm;
and screening out spurious feature points, the remaining feature point pairs being taken as the matching feature points.
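A minimal sketch of RANSAC screening under a pure-translation model, which fits this calibration setting because pictures one and two differ by a known platform translation; the tolerance and iteration count are illustrative assumptions:

```python
import random
import numpy as np

def ransac_translation(src, dst, tol=2.0, iters=200, seed=0):
    """Screen matched point pairs with RANSAC under a pure-translation
    model: inlier pairs share (approximately) one displacement vector."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    rng = random.Random(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.randrange(len(src))
        shift = dst[i] - src[i]                # model fitted from one sample
        err = np.linalg.norm(dst - (src + shift), axis=1)
        inliers = err < tol                    # pairs consistent with the shift
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

The returned mask selects the consensus set; mismatched pairs whose displacement disagrees with the common platform shift are rejected.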
6. The method for splicing high-precision images based on a grating ruler as claimed in claim 5, wherein after the gray value of each pixel point on the image is calculated by a bilinear interpolation method, the method further comprises:
calculating a brightness compensation coefficient according to the matching feature points;
and performing brightness compensation on the small image needing brightness compensation according to the compensation coefficient.
7. The method for splicing high-precision images based on a grating ruler as claimed in claim 6, wherein the calculating the brightness compensation coefficient according to the matched feature points comprises:
selecting a range which does not contain other feature points around any pair of matched feature points in the first picture and the second picture;
calculating the mean value of the gray values of the pixel points in the range on the two pictures;
dividing the mean gray value on picture one by the mean gray value on picture two (or vice versa), the result being used as the brightness compensation coefficient of the feature point;
performing the operation on all the matched feature points on the two pictures to obtain the brightness compensation coefficients of all the feature points;
and taking the average value of the brightness compensation coefficients of all the characteristic points as the brightness compensation coefficient of the picture needing brightness compensation.
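The brightness compensation coefficient described above can be sketched as follows; the square window of fixed half-size is an assumption, since the claim only requires a range containing no other feature points:

```python
import numpy as np

def brightness_coeff(img1, img2, matches, half=5):
    """Around each matched feature point pair, average the gray values in
    a small window on each picture, take the ratio of the two means, and
    average the ratios over all pairs to get one compensation coefficient."""
    img1, img2 = np.asarray(img1, float), np.asarray(img2, float)
    ratios = []
    for (x1, y1), (x2, y2) in matches:
        w1 = img1[max(y1 - half, 0):y1 + half + 1, max(x1 - half, 0):x1 + half + 1]
        w2 = img2[max(y2 - half, 0):y2 + half + 1, max(x2 - half, 0):x2 + half + 1]
        ratios.append(w1.mean() / w2.mean())
    return float(np.mean(ratios))
```

Multiplying the gray values of the darker picture by this coefficient equalizes its brightness with its neighbor before the two are fused.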
8. The method for splicing high-precision images based on the grating ruler as claimed in claim 1, wherein the actual physical coordinates of each small image are calculated by using the following formula:
(x_A0, y_A0), (x_A0 + W·p_x, y_A0), (x_A0 + W·p_x, y_A0 + H·p_y), (x_A0, y_A0 + H·p_y) (1)
In formula (1), W and H are respectively the horizontal and vertical resolutions of the picture, p_x is the magnification of the camera in the x-axis direction, p_y is the magnification of the camera in the y-axis direction, and (x_A0, y_A0) is the physical coordinate of the lower-left vertex of the small picture, with (x_A0, y_A0) = (x_0·cosθ_x + y_0·sinθ_y, x_0·sinθ_x + y_0·cosθ_y), where (x_0, y_0) is the recorded grating ruler reading, θ_x is the included angle between the platform x-axis and the camera x-axis, and θ_y is the included angle between the platform y-axis and the camera y-axis.
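Formula (1) can be evaluated directly; the sketch below computes the four physical corner coordinates of one small picture from the grating ruler reading recorded at capture:

```python
import math

def tile_corners(x0, y0, W, H, p_x, p_y, theta_x, theta_y):
    """Physical coordinates of the four corners of one small picture,
    per formula (1); (x0, y0) is the recorded grating ruler reading."""
    # Lower-left vertex, corrected for the platform/camera axis angles.
    xA0 = x0 * math.cos(theta_x) + y0 * math.sin(theta_y)
    yA0 = x0 * math.sin(theta_x) + y0 * math.cos(theta_y)
    return [(xA0, yA0),
            (xA0 + W * p_x, yA0),
            (xA0 + W * p_x, yA0 + H * p_y),
            (xA0, yA0 + H * p_y)]
```

With perfectly aligned axes (θ_x = θ_y = 0) the correction reduces to the raw reading, and the small picture spans W·p_x by H·p_y physical units.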
9. The method for splicing high-precision images based on the grating ruler as claimed in claim 1, wherein the one-to-one correspondence between the pixel points on the image and the pixel points on each small image comprises:
mapping the pixel coordinates of the pixel points on the formed image to actual physical coordinates;
and obtaining the pixel coordinates of the corresponding pixel points on each small picture according to the physical positions of the pixel points of the formed image.
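An illustrative sketch of this mapping, assuming the formed image and small-picture axes are already aligned (any rotation correction applied beforehand); the resulting small-picture coordinates are generally fractional, which is why the gray value is then obtained by bilinear interpolation:

```python
def mosaic_to_tile_pixel(u, v, mosaic_origin, p_x, p_y, tile_origin):
    """Map a formed-image pixel (u, v) to physical coordinates, then to
    (sub-pixel) coordinates in the small picture that covers that spot."""
    # Formed-image pixel -> physical position.
    X = mosaic_origin[0] + u * p_x
    Y = mosaic_origin[1] + v * p_y
    # Physical position -> small-picture pixel (generally fractional).
    tu = (X - tile_origin[0]) / p_x
    tv = (Y - tile_origin[1]) / p_y
    return tu, tv
```

Both conversions use the same magnifications p_x and p_y, so the mapping is exact up to floating-point rounding once the origins are known.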
10. A high-precision image splicing system based on a grating ruler is characterized by comprising the following modules:
the attribute calibration module is used for executing the step of performing attribute calibration on the mobile platform with the grating ruler by using the method for matching the feature points as claimed in claim 4, and storing corresponding platform attributes;
and the image splicing module is used for calculating the actual physical coordinates of each small image, calculating the actual physical coordinates of the image, corresponding the pixel points on the image to the pixel points on each small image one by one, calculating the gray value of each pixel point on the image by a bilinear interpolation method, and completing splicing.
11. The grating ruler-based high-precision image splicing system as claimed in claim 10, further comprising:
the brightness compensation module is used for calculating a brightness compensation coefficient according to the matching feature points and performing brightness compensation on the small graph needing the brightness compensation according to the compensation coefficient;
a seam optimization module for performing the seam optimization step using the seam optimization algorithm of claim 2;
and the rotation correction module is used for calculating a rotation matrix when all the small images are spliced according to the characteristic point matching relation among the small images and transforming all the small images according to the rotation matrix.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores program data which can be executed to implement the raster ruler-based high-precision image stitching method according to any one of claims 1 to 9.
CN202210956388.7A 2022-08-10 2022-08-10 High-precision image splicing method and system based on grating ruler and storage medium Pending CN115345780A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210956388.7A CN115345780A (en) 2022-08-10 2022-08-10 High-precision image splicing method and system based on grating ruler and storage medium


Publications (1)

Publication Number Publication Date
CN115345780A true CN115345780A (en) 2022-11-15

Family

ID=83951335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210956388.7A Pending CN115345780A (en) 2022-08-10 2022-08-10 High-precision image splicing method and system based on grating ruler and storage medium

Country Status (1)

Country Link
CN (1) CN115345780A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination