WO2020206903A1 - Image matching method and device, and computer-readable storage medium - Google Patents

Image matching method and device, and computer-readable storage medium

Info

Publication number
WO2020206903A1
WO2020206903A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
matching
image matching
epipolar
images
Prior art date
Application number
PCT/CN2019/102187
Other languages
English (en)
Chinese (zh)
Inventor
王义文
王健宗
Original Assignee
平安科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 平安科技(深圳)有限公司
Publication of WO2020206903A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • This application relates to the field of computer technology, and in particular to an image matching method, device and computer-readable storage medium.
  • Image matching refers to the process of identifying points with the same name (corresponding points) between two or more images through a matching algorithm. It is an important preliminary step in image fusion, target recognition, target change detection, computer vision, and other problems, and it has a wide range of applications in fields such as remote sensing, digital photogrammetry, computer vision, cartography, and military applications. At present, the most common methods of image matching are: judging the differing points of images by visual inspection; comparing the gray values of all pixels in a target area, following the principle that images are in essence arrays of pixels; or, based on the template-matching principle, finding the position in the search image that is identical or most similar to a sub-image of the target image.
  • the present application provides an image matching method, device, and computer-readable storage medium, the main purpose of which is to provide a new image matching method applied to dense stereo scenes under aerial photography to improve image matching efficiency.
  • an image matching method provided by this application includes:
  • an image imaging map is generated according to the scene image taken by an aerial camera, and first image matching is performed on the image imaging map using the scale-invariant feature transform method to generate a first image matching set;
  • based on the first image matching set, epipolar images are generated and the degree of overlap between the epipolar images is calculated, completing the second image matching and generating a second image matching set;
  • based on the second image matching set, dense matching of all pixels between images is established, a third image matching set is generated, and three-dimensional reconstruction is performed to obtain a reconstructed scene image.
  • the present application also provides an image matching device.
  • the device includes a memory and a processor.
  • the memory stores an image matching program that can run on the processor.
  • when the image matching program is executed by the processor, the steps of the image matching method described above are implemented.
  • in addition, the present application also provides a computer-readable storage medium, with an image matching program stored on the computer-readable storage medium; the image matching program can be executed by one or more processors to implement the steps of the image matching method described above.
  • the image matching method, device, and computer-readable storage medium proposed in this application generate an image imaging map based on the scene image shot by an aerial camera and use the scale-invariant feature transform method to perform first image matching on the image imaging map, generating the first image matching set; based on the first image matching set, epipolar images are generated and the degree of overlap between them is calculated, completing the second image matching and generating a second image matching set; based on the second image matching set, dense matching of all pixels between images is established, a third image matching set is generated, and 3D reconstruction is performed to obtain a reconstructed scene image.
  • This application improves the efficiency of image matching, and can perform three-dimensional reconstruction of images of dense scenes under aerial photography, so as to more effectively help users to conduct analysis and research.
  • FIG. 1 is a schematic flowchart of an image matching method provided by an embodiment of this application.
  • FIG. 2 is a schematic diagram of the internal structure of an image matching device provided by an embodiment of the application.
  • Fig. 3 is a schematic diagram of modules of an image matching program in an image matching device provided by an embodiment of the application.
  • This application provides an image matching method.
  • referring to FIG. 1, it is a schematic flowchart of an image matching method provided by an embodiment of this application.
  • the method can be executed by a device, and the device can be implemented by software and/or hardware.
  • the image matching method includes:
  • scene images taken by aerial equipment, such as drones and helicopters with flight control systems, are numerous and cover a wide viewing angle, and the buildings in them are particularly dense. Therefore, this application first restores the overlapping image sets to their respective positions and reconstructs the imaging map of the object.
  • specifically, the model formula for object imaging under aerial photography is used to recover the positions of the overlapping scene images and generate the image imaging map.
  • the model formula of object imaging is as follows (a code sketch follows the variable definitions below):

    s·m = K·R·[I | −C]·M

    where:
  • s is the scale factor
  • m is the coordinate of the image point
  • M is the coordinate of the object point (the object point and the image point are the object position and the image position in the optical imaging respectively)
  • K is the internal parameter matrix of the aerial photography tool, composed of the focal length and the principal point coordinates
  • R is a rotation matrix, which can be converted to approximate values from the yaw, pitch, and roll recorded by the aerial photography tool's flight control system
  • C is the projection center, whose position vector can be approximated directly from the longitude, latitude, and altitude recorded by the GPS of the aerial photography tool
  • I is the third-order unit matrix.
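  • As a concrete illustration of the formula above, the sketch below projects an object point with s·m = K·R·[I | −C]·M; the focal length, principal point, pose, and coordinates are hypothetical values, not taken from the application.

```python
import numpy as np

def project_point(M, K, R, C):
    """Project object point M to image coordinates via s*m = K*R*[I | -C]*M."""
    M_h = np.append(M, 1.0)                               # homogeneous object point
    P = K @ R @ np.hstack([np.eye(3), -C.reshape(3, 1)])  # 3x4 projection matrix
    s_m = P @ M_h                                         # s*m (scale not yet removed)
    return s_m / s_m[2]                                   # divide out the scale factor s

K = np.array([[1000.0, 0.0, 512.0],    # hypothetical focal length and
              [0.0, 1000.0, 384.0],    # principal point coordinates
              [0.0, 0.0, 1.0]])
R = np.eye(3)                           # rotation from yaw/pitch/roll (identity here)
C = np.array([0.0, 0.0, -100.0])        # projection center from GPS (hypothetical)
print(project_point(np.array([10.0, 5.0, 0.0]), K, R, C))  # -> [612. 434. 1.]
```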
  • after the positions are recovered, the imaging maps of the n images can be obtained, and their overlap relationships form an undirected graph G_n(V_n, E_n), where V_n is called the vertex set and E_n is called the edge set. (A graph is a widely used data structure; the nodes in a graph are called vertices, and the relationship between two vertices can be represented by a pair, called an edge. If the pairs representing edges are ordered, the graph is called a directed graph; if they are unordered, it is called an undirected graph.)
  • the edge set E_n of the undirected graph contains n_E edges; each vertex represents one image and each edge represents an overlapping image pair, so the subsequent image matching process is performed only between the n_E image pairs. If the relationships between images were not considered and an exhaustive traversal strategy were used for image matching, the total number of matches would be n(n−1)/2, which is usually much larger than n_E.
  • the method of constructing the image-relationship undirected graph limits the scope of image matching: it avoids blind image matching, reduces the total matching computational complexity from O(n²) to O(n), and improves matching efficiency; at the same time, it effectively eliminates the interference of unrelated image pairs, fundamentally avoids mismatches caused by non-overlapping images, and improves matching accuracy and reconstruction robustness.
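  • A minimal sketch of the image-relationship undirected graph described above: vertices are images, edges are overlapping image pairs, and matching is limited to the n_E edges instead of the exhaustive n(n−1)/2 pairs. The overlap relationships here are hypothetical.

```python
n = 5                                     # number of images
edges = {(0, 1), (1, 2), (2, 3), (3, 4)}  # n_E overlapping pairs (hypothetical)

exhaustive = n * (n - 1) // 2             # 10 pairs under exhaustive traversal
print(f"exhaustive: {exhaustive}, graph-limited: {len(edges)}")

adjacency = {v: set() for v in range(n)}  # undirected graph: edges are unordered
for i, j in edges:
    adjacency[i].add(j)
    adjacency[j].add(i)
```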
  • next, the scale-invariant feature transform (SIFT) algorithm is used for image matching.
  • after image matching, if there are few matching points between two images I_i and I_j (fewer than the threshold N_1), the overlap is small or the correlation is weak, and (I_i, I_j) is removed from the edge set; if the number of matching points between I_i and I_j is greater than the threshold N_1, the image pair is retained. This yields n_1E image pairs in total and generates the first image matching set E_1.
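  • A sketch of this first matching pass using OpenCV's SIFT implementation (assuming cv2.SIFT_create is available); the ratio-test value and the threshold N_1 are illustrative assumptions.

```python
import cv2

def sift_matches(img_i, img_j, ratio=0.75):
    """Return the good SIFT matches between two grayscale images."""
    sift = cv2.SIFT_create()
    _, des_i = sift.detectAndCompute(img_i, None)
    _, des_j = sift.detectAndCompute(img_j, None)
    matcher = cv2.BFMatcher()                     # L2 norm suits SIFT descriptors
    pairs = matcher.knnMatch(des_i, des_j, k=2)
    # Lowe's ratio test keeps only distinctive correspondences
    return [m for m, n in pairs if m.distance < ratio * n.distance]

N1 = 30  # threshold on the number of matching points (illustrative)
# keep (I_i, I_j) in E_1 only if len(sift_matches(I_i, I_j)) > N1
```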
  • S20: Based on the first image matching set, generate epipolar images and calculate the degree of overlap between the epipolar images, complete the second image matching, and generate a second image matching set.
  • the epipolar image is a method of changing the search range from a two-dimensional plane image to a one-dimensional straight line during the matching process.
  • the plane formed by the shooting baseline and any ground point is called the epipolar plane;
  • the intersection line between the epipolar plane and the image plane is called the epipolar line.
  • the image points with the same name must be on the epipolar line of the same name, and the image points on the epipolar line of the same name have a one-to-one correspondence.
  • if an epipolar line pair with the same name can be determined on a stereo image pair, then, using the above properties, the two-dimensional search and matching of images can be transformed into one-dimensional search and matching along the epipolar line.
  • the epipolar image eliminates the vertical parallax between the stereo images, narrows the search range, reduces the amount of matching computation, and improves matching accuracy, so it is of great significance for dense stereo image matching.
  • a preferred embodiment of the present application discloses a method for making and matching epipolar images for generating epipolar images and calculating the degree of overlap between the epipolar images.
  • the method includes: (a) after point-feature extraction is performed on the image pairs using the SIFT algorithm, uniformly distributed high-precision points with the same name are obtained, and fundamental (basic) matrix estimation based on the RANSAC strategy is used to obtain the fundamental matrices of the n_1E image pairs; (b) the fundamental matrix is used to determine the epipolar line with the same name corresponding to each group of points with the same name; (c) according to the principle that the epipolar lines must intersect at the epipole, the least squares method is used to determine the epipole coordinates of each image pair, a quick mapping of the epipolar lines between the images is generated from the epipole coordinates, and resampling is performed by bilinear interpolation along the epipolar direction, completing epipolar image production and matching; this yields n_2E image pairs in total and generates the second image matching set.
  • the epipolar image production method based on the basic matrix can avoid the problems of iterative calculation and initial value assignment when calculating the relative relationship, and it can also have good accuracy when the aerial photography uses a large angle of view.
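  • A sketch of fundamental-matrix estimation with a RANSAC strategy and of mapping points to their epipolar lines, using standard OpenCV calls; the RANSAC threshold and confidence are illustrative.

```python
import cv2
import numpy as np

def fundamental_and_epilines(pts_i, pts_j):
    """Estimate F with RANSAC and map points of image i to epipolar lines in image j."""
    # pts_i, pts_j: Nx2 float32 arrays of points with the same name
    F, inliers = cv2.findFundamentalMat(pts_i, pts_j, cv2.FM_RANSAC, 1.0, 0.99)
    lines_j = cv2.computeCorrespondEpilines(pts_i.reshape(-1, 1, 2), 1, F)
    return F, inliers, lines_j.reshape(-1, 3)   # each epipolar line as (a, b, c)
```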
  • specifically, a rotation matrix and a projection matrix are constructed, and the rotation matrix is decomposed into rotations about the x, y, and z axes, each taking the standard form:

    R_x(ω) = [1 0 0; 0 cos ω −sin ω; 0 sin ω cos ω], R_y(φ) = [cos φ 0 sin φ; 0 1 0; −sin φ 0 cos φ], R_z(κ) = [cos κ −sin κ 0; sin κ cos κ 0; 0 0 1]
  • the least squares method is used to determine the epipole (core point) coordinates (x_p, y_p) of the image from the calculation result of the projection matrix.
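  • A minimal sketch of this least-squares determination: all epipolar lines a·x + b·y + c = 0 pass through the epipole, so it solves an overdetermined linear system.

```python
import numpy as np

def epipole_from_lines(lines):
    """Least-squares intersection of epipolar lines a*x + b*y + c = 0."""
    A = lines[:, :2]                 # coefficients (a, b) of each line
    b = -lines[:, 2]                 # move c to the right-hand side
    (x_p, y_p), *_ = np.linalg.lstsq(A, b, rcond=None)
    return x_p, y_p
```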
  • the mapping directly rectifies the epipolar image on the central projection image.
  • the specific steps of the mapping are: derive the central projection according to the collinearity condition equation, calculate the angular relationship between two adjacent epipolar lines when the epipolar lines are sampled, and determine each epipolar line on the central projection image; then use the fundamental matrix generated above to determine the epipolar lines corresponding to each image pair, and from the epipole coordinates of each image determine the epipolar line through a given point, completing the epipolar correspondence within the same image pair; the epipolar line equation is expressed in terms of the epipole coordinates and the reference coordinates of the central projection image:
  • (x_p, y_p) are the epipole coordinates calculated above and (x_base, y_base) are the reference coordinates of the central projection image; the epipolar equation of the other image of the pair is obtained in the same way.
  • after the epipolar line mapping is established for each image pair, the epipolar images are generated according to the resampling rule of the bilinear interpolation method and their degree of overlap is calculated; image pairs whose epipolar image overlap is less than the threshold N_2 are discarded, yielding n_2E image pairs in total and the second image matching set E_2.
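  • A sketch of the bilinear interpolation used when resampling along the epipolar direction; the image is assumed to be a grayscale float array and (x, y) a sub-pixel position.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample a grayscale image at sub-pixel position (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    # weighted mean of the four surrounding pixels
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x0 + 1] +
            (1 - dx) * dy * img[y0 + 1, x0] +
            dx * dy * img[y0 + 1, x0 + 1])
```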
  • restoring the geometric structure of the three-dimensional object surface from two-dimensional images requires further processing, so the following S30 is executed.
  • the preferred embodiment of the present application adopts a dense matching algorithm: on the basis of the second image matching set E_2, the smallest univalue segment assimilating nucleus (SUSAN) corner detection algorithm is used to extract the corner points of the images, forming a set of matching corner points; combined with epipolar geometric constraints, dynamic programming algorithms, and other methods, dense matching of all pixels between images is established. The specific steps are as follows:
  • (a) detect the corner points of the images in the second image matching set E_2.
  • the corner point is the point where the local curvature changes the most on the contour of the image. It contains important information in the image, so it is of great significance for the detail matching of the image in the image set E 2 .
  • the preferred embodiment of the present application adopts the smallest univalue segment assimilating nucleus (SUSAN) method to detect image corners: Gaussian smoothing is performed on the input image; each pixel in the image is traversed, and the Sobel operator (a discrete first-order difference operator used to approximate the gradient of the image brightness function) is first used to determine whether the pixel is an edge point; if it is an edge point, whether the path loss L_r(p, d) of the corner point is minimized is judged according to the minimum of the loss function of the global energy equation, with the determination principle as follows:
  • C(p, d) is the local loss of the path, and L_r,min(p − r) is the minimum loss of the previous step along the path direction r; from these it can be determined whether the point is a corner point, and redundant corner points are removed. Further, if two detected corner points are adjacent, the one with the larger L_r(p, d) is removed.
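  • The exact loss recursion is not reproduced in this text; the sketch below follows a widely used semi-global aggregation form that is consistent with the symbols above (C(p, d) and L_r,min(p − r)), with hypothetical penalty constants P1 and P2.

```python
import numpy as np

def aggregate_path(C, P1=10.0, P2=120.0):
    """Accumulate path loss L_r along one direction r from local losses C(p, d)."""
    L = np.zeros_like(C)             # C: (num_positions, num_candidates)
    L[0] = C[0]
    for p in range(1, C.shape[0]):
        prev = L[p - 1]
        prev_min = prev.min()        # L_r,min(p - r)
        for d in range(C.shape[1]):
            step = min(prev[d],
                       prev[d - 1] + P1 if d > 0 else np.inf,
                       prev[d + 1] + P1 if d + 1 < C.shape[1] else np.inf,
                       prev_min + P2)
            L[p, d] = C[p, d] + step - prev_min   # subtract to keep values bounded
    return L
```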
  • the automatic matching of corner points can effectively distinguish the difference between image pairs based on the similarities and differences of the corner points, which is an effective means to achieve precise matching.
  • (b) automatic matching of corner points can be divided into the following steps: (1) for each point in the corner set of one image, find matching corner points in the corresponding search area of the other image; similarly, for each point in the corner set of the other image, search in the same way for corresponding matching points in the first image; the intersection of these two matching point sets is called the initial matching point set K_1; (2) in the initial matching point set K_1, for each pair of mutually corresponding corner points, find the matching point in the corresponding search area: calculate the similarity between the point and each candidate matching point in the search area, and select the candidate with the greatest similarity as its matching point.
  • the similarity calculation adopts the gradient-magnitude similarity method: if the gradient magnitude of a pixel is g, and the gradient of a matching pixel approximately obeys a normal distribution, then the similarity l_g of the two pixels is:
  • d(x) is the density function and k is the density coefficient.
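  • The similarity formula itself is not reproduced in this text; the sketch below makes the stated assumption concrete by evaluating a normal density d(x) at the gradient difference and scaling by the coefficient k. The Gaussian form and σ are assumptions, not the application's exact formulation.

```python
import numpy as np

def gradient_similarity(g, g_prime, sigma=1.0, k=1.0):
    """Similarity l_g of two gradient magnitudes, assuming a Gaussian d(x)."""
    d = np.exp(-((g - g_prime) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return k * d                      # k is the density coefficient
```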
  • (c) the dense matching includes: (1) obtain the matching point set K_2 according to the epipolar geometric constraint relationship, from which the epipolar line correspondences of the images are obtained, giving the epipolar correspondence set K_3.
  • the so-called epipolar geometric constraint relationship means that if l and l′ are two corresponding epipolar lines in the left and right images, then the point in the right image corresponding to a point p on the epipolar line l of the left image must lie on the epipolar line l′.
  • (2) the epipolar lines in K_3 are segmented according to gray level: each epipolar line is divided into several gray segments, and the gray values of the pixels on each segment are similar.
  • the formula for gray-level segmentation is as follows:

    |I(x_(t+1), y_(t+1)) − I(x_t, y_t)| < T, for t = 1, 2, …, w − 1
  • the physical meaning of the above formula is to divide the continuous points of gray value in a certain range into one section.
  • I (x t , y t ) is the gray value of the pixel (x t , y t );
  • w is the number of pixels on a gray segment, that is, the length of the gray segment;
  • T is a certain threshold; the smaller its value, the fewer pixels are divided into each gray segment and the more gray segments there are.
  • the matching effect is the best when T is set to 3.
  • the resulting set of gray-level segments is K_4; (3) a dynamic programming algorithm (an optimization method for finding the best matching path) is used to establish the correspondence between gray-level segments, and linear interpolation is used to establish the correspondence between the pixels of corresponding gray-level segments, so that dense matching of all pixels between the images is realized and the third image matching set E_3 is obtained.
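  • A sketch of the two segment-level operations above: splitting an epipolar scanline into gray segments with threshold T (T = 3 is the value reported above), then aligning two segment sequences with a simple dynamic program; the cost model and skip penalty are illustrative assumptions.

```python
def gray_segments(scanline, T=3):
    """Split a scanline into runs of similar gray values (threshold T)."""
    segments, current = [], [scanline[0]]
    for value in scanline[1:]:
        if abs(value - current[-1]) <= T:    # same gray segment
            current.append(value)
        else:                                # gray jump: start a new segment
            segments.append(current)
            current = [value]
    segments.append(current)
    return segments

def align_segments(a, b, skip=20.0):
    """Dynamic-programming alignment of two segment sequences by mean gray."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if cost[i][j] == INF:
                continue
            if i < n and j < m:              # match segment i with segment j
                c = cost[i][j] + abs(a[i] - b[j])
                cost[i + 1][j + 1] = min(cost[i + 1][j + 1], c)
            if i < n:                        # leave segment i unmatched
                cost[i + 1][j] = min(cost[i + 1][j], cost[i][j] + skip)
            if j < m:                        # leave segment j unmatched
                cost[i][j + 1] = min(cost[i][j + 1], cost[i][j] + skip)
    return cost[n][m]

means = [sum(s) / len(s) for s in gray_segments([10, 11, 12, 40, 41, 90])]
print(means, align_segments(means, [12.0, 41.0, 88.0]))
```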
  • finally, based on the third image matching set, the preset 3ds Max software can be used to reconstruct the scene and restore the three-dimensional geometric information of the scene space, obtaining the reconstructed image.
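  • A sketch of the final step as plain triangulation, assuming the two 3x4 projection matrices of an image pair are known; the application itself delegates scene reconstruction to preset 3ds Max software, so this only illustrates recovering 3D points from dense matches.

```python
import cv2
import numpy as np

def triangulate(P1, P2, pts1, pts2):
    """Recover Nx3 scene points from matched 2xN pixel arrays of an image pair."""
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4xN homogeneous points
    return (X_h[:3] / X_h[3]).T                      # normalize the scale factor
```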
  • the application also provides an image matching device.
  • referring to FIG. 2, it is a schematic diagram of the internal structure of an image matching device provided by an embodiment of this application.
  • the image matching device 1 may be a PC (Personal Computer, personal computer), or a terminal device such as a smart phone, a tablet computer, or a portable computer.
  • the image matching device 1 at least includes a memory 11, a processor 12, a communication bus 13, and a network interface 14.
  • the memory 11 includes at least one type of readable storage medium, and the readable storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, and the like.
  • the memory 11 may be an internal storage unit of the image matching device 1 in some embodiments, such as a hard disk of the image matching device 1.
  • the memory 11 may also be an external storage device of the image matching device 1, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the image matching device 1.
  • the memory 11 may also include both an internal storage unit of the image matching device 1 and an external storage device.
  • the memory 11 can be used not only to store application software and various data installed in the image matching device 1, such as the code of the image matching program 01, etc., but also to temporarily store data that has been output or will be output.
  • the processor 12 may, in some embodiments, be a central processing unit (CPU), controller, microcontroller, microprocessor, or other data processing chip, and is used to run the program code stored in the memory 11 or to process data, for example to execute the image matching program 01.
  • the communication bus 13 is used to realize the connection and communication between these components.
  • the network interface 14 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and is usually used to establish a communication connection between the device 1 and other electronic devices.
  • the device 1 may also include a user interface.
  • the user interface may include a display (Display) and an input unit such as a keyboard (Keyboard).
  • the optional user interface may also include a standard wired interface and a wireless interface.
  • the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode, organic light emitting diode) touch device, etc.
  • the display can also be called a display screen or a display unit as appropriate, and is used to display the information processed in the image matching device 1 and to display a visualized user interface.
  • FIG. 2 only shows the image matching device 1 with components 11-14 and the image matching program 01; however, the illustrated structure does not constitute a limitation on the image matching device 1, which may include fewer or more components than shown, or combine certain components, or have a different component arrangement.
  • the image matching program 01 is stored in the memory 11; when the processor 12 executes the image matching program 01 stored in the memory 11, the following steps are implemented:
  • the first step is to generate an image imaging map according to the scene image taken by the aerial camera, and use the scale-invariant feature transformation method to perform the initial image matching on the image imaging map to generate the initial image matching set.
  • scene images captured by aerial equipment, such as drones and helicopters with flight control systems, are numerous and cover a wide viewing angle, and the buildings in them are particularly dense. Therefore, this application first restores the overlapping image sets to their respective positions and reconstructs the imaging map of the object.
  • specifically, the model formula for object imaging under aerial photography is used to recover the positions of the overlapping scene images and generate the image imaging map.
  • the model formula of object imaging is as follows:

    s·m = K·R·[I | −C]·M

    where:
  • s is the scale factor
  • m is the coordinate of the image point
  • M is the coordinate of the object point (the object point and the image point are the object position and the image position in the optical imaging respectively)
  • K is the internal parameter matrix of the aerial photography tool, composed of the focal length and the principal point coordinates
  • R is a rotation matrix, which can be converted to approximate values from the yaw, pitch, and roll recorded by the aerial photography tool's flight control system
  • C is the projection center, whose position vector can be approximated directly from the longitude, latitude, and altitude recorded by the GPS of the aerial photography tool
  • I is the third-order unit matrix.
  • after the positions are recovered, the imaging maps of the n images can be obtained, and their overlap relationships form an undirected graph G_n(V_n, E_n), where V_n is called the vertex set and E_n is called the edge set. (A graph is a widely used data structure; the nodes in a graph are called vertices, and the relationship between two vertices can be represented by a pair, called an edge. If the pairs representing edges are ordered, the graph is called a directed graph; if they are unordered, it is called an undirected graph.)
  • the edge set E_n of the undirected graph contains n_E edges; each vertex represents one image and each edge represents an overlapping image pair, so the subsequent image matching process is performed only between the n_E image pairs. If the relationships between images were not considered and an exhaustive traversal strategy were used for image matching, the total number of matches would be n(n−1)/2, which is usually much larger than n_E.
  • the method of constructing the image-relationship undirected graph limits the scope of image matching: it avoids blind image matching, reduces the total matching computational complexity from O(n²) to O(n), and improves matching efficiency; at the same time, it effectively eliminates the interference of unrelated image pairs, fundamentally avoids mismatches caused by non-overlapping images, and improves matching accuracy and reconstruction robustness.
  • next, the scale-invariant feature transform (SIFT) algorithm is used for image matching.
  • after image matching, if there are few matching points between two images I_i and I_j (fewer than the threshold N_1), the overlap is small or the correlation is weak, and (I_i, I_j) is removed from the edge set; if the number of matching points between I_i and I_j is greater than the threshold N_1, the image pair is retained. This yields n_1E image pairs in total and generates the first image matching set E_1.
  • the second step is to generate epipolar images based on the first image matching set and calculate the degree of overlap between the epipolar images, to complete the second image matching and generate a second image matching set.
  • the above-mentioned first step only filters out images with no overlap or low overlap.
  • therefore, this application continues to use the epipolar image method to perform matching filtering.
  • the epipolar image is a method of changing the search range from a two-dimensional plane image to a one-dimensional straight line during the matching process.
  • the plane formed by the shooting baseline and any ground point is called the epipolar plane;
  • the intersection line between the epipolar plane and the image plane is called the epipolar line.
  • the image points with the same name must be on the epipolar line of the same name, and the image points on the epipolar line of the same name have a one-to-one correspondence.
  • if an epipolar line pair with the same name can be determined on a stereo image pair, then, using the above properties, the two-dimensional search and matching of images can be transformed into one-dimensional search and matching along the epipolar line.
  • the epipolar image eliminates the vertical parallax between the stereo images, narrows the search range, reduces the amount of matching computation, and improves matching accuracy, so it is of great significance for dense stereo image matching.
  • a preferred embodiment of the present application discloses a method for making and matching epipolar images for generating epipolar images and calculating the degree of overlap between the epipolar images.
  • the method includes: (a) after point-feature extraction is performed on the image pairs using the SIFT algorithm, uniformly distributed high-precision points with the same name are obtained, and fundamental (basic) matrix estimation based on the RANSAC strategy is used to obtain the fundamental matrices of the n_1E image pairs; (b) the fundamental matrix is used to determine the epipolar line with the same name corresponding to each group of points with the same name; (c) according to the principle that the epipolar lines must intersect at the epipole, the least squares method is used to determine the epipole coordinates of each image pair, a quick mapping of the epipolar lines between the images is generated from the epipole coordinates, and resampling is performed by bilinear interpolation along the epipolar direction, completing epipolar image production and matching; this yields n_2E image pairs in total and generates the second image matching set.
  • the epipolar image production method based on the basic matrix can avoid the problems of iterative calculation and initial value assignment when calculating the relative relationship, and it can also have good accuracy when the aerial photography uses a large angle of view.
  • specifically, a rotation matrix and a projection matrix are constructed, and the rotation matrix is decomposed into rotations about the x, y, and z axes, each taking the standard form:

    R_x(ω) = [1 0 0; 0 cos ω −sin ω; 0 sin ω cos ω], R_y(φ) = [cos φ 0 sin φ; 0 1 0; −sin φ 0 cos φ], R_z(κ) = [cos κ −sin κ 0; sin κ cos κ 0; 0 0 1]
  • the least squares method is used to determine the epipole (core point) coordinates (x_p, y_p) of the image from the calculation result of the projection matrix.
  • the mapping directly rectifies the epipolar image on the central projection image.
  • the specific steps of the mapping are: derive the central projection according to the collinearity condition equation, calculate the angular relationship between two adjacent epipolar lines when the epipolar lines are sampled, and determine each epipolar line on the central projection image; then use the fundamental matrix generated above to determine the epipolar lines corresponding to each image pair, and from the epipole coordinates of each image determine the epipolar line through a given point, completing the epipolar correspondence within the same image pair; the epipolar line equation is expressed in terms of the epipole coordinates and the reference coordinates of the central projection image:
  • (x_p, y_p) are the epipole coordinates calculated above and (x_base, y_base) are the reference coordinates of the central projection image; the epipolar equation of the other image of the pair is obtained in the same way.
  • after the epipolar line mapping is established for each image pair, the epipolar images are generated according to the resampling rule of the bilinear interpolation method and their degree of overlap is calculated; image pairs whose epipolar image overlap is less than the threshold N_2 are discarded, yielding n_2E image pairs in total and the second image matching set E_2.
  • restoring the geometric structure of the three-dimensional object surface from two-dimensional images requires further processing, so the following third step is performed.
  • the preferred embodiment of the present application adopts a dense matching algorithm: on the basis of the second image matching set E_2, the smallest univalue segment assimilating nucleus (SUSAN) corner detection algorithm is used to extract the corner points of the images, forming a set of matching corner points; combined with epipolar geometric constraints, dynamic programming algorithms, and other methods, dense matching of all pixels between images is established. The specific steps are as follows:
  • (a) detect the corner points of the images in the second image matching set E_2.
  • the corner point is the point where the local curvature changes the most on the contour of the image. It contains important information in the image, so it is of great significance for the detail matching of the image in the image set E 2 .
  • the preferred embodiment of the present application adopts the smallest univalue segment assimilating nucleus (SUSAN) method to detect image corners: Gaussian smoothing is performed on the input image; each pixel in the image is traversed, and the Sobel operator (a discrete first-order difference operator used to approximate the gradient of the image brightness function) is first used to determine whether the pixel is an edge point; if it is an edge point, whether the path loss L_r(p, d) of the corner point is minimized is judged according to the minimum of the loss function of the global energy equation, with the determination principle as follows:
  • C(p, d) is the local loss of the path, and L_r,min(p − r) is the minimum loss of the previous step along the path direction r; from these it can be determined whether the point is a corner point, and redundant corner points are removed. Further, if two detected corner points are adjacent, the one with the larger L_r(p, d) is removed.
  • the automatic matching of corner points can effectively distinguish the difference between image pairs based on the similarities and differences of the corner points, which is an effective means to achieve precise matching.
  • (b) automatic matching of corner points can be divided into the following steps: (1) for each point in the corner set of one image, find matching corner points in the corresponding search area of the other image; similarly, for each point in the corner set of the other image, search in the same way for corresponding matching points in the first image; the intersection of these two matching point sets is called the initial matching point set K_1; (2) in the initial matching point set K_1, for each pair of mutually corresponding corner points, find the matching point in the corresponding search area: calculate the similarity between the point and each candidate matching point in the search area, and select the candidate with the greatest similarity as its matching point.
  • the similarity calculation adopts the gradient-magnitude similarity method: if the gradient magnitude of a pixel is g, and the gradient of a matching pixel approximately obeys a normal distribution, then the similarity l_g of the two pixels is:
  • d(x) is the density function
  • k is the density coefficient
  • (c) the dense matching includes: (1) obtain the matching point set K_2 according to the epipolar geometric constraint relationship, from which the epipolar line correspondences of the images are obtained, giving the epipolar correspondence set K_3.
  • the so-called epipolar geometric constraint relationship means that if l and l′ are two corresponding epipolar lines in the left and right images, then the point in the right image corresponding to a point p on the epipolar line l of the left image must lie on the epipolar line l′.
  • (2) the epipolar lines in K_3 are segmented according to gray level: each epipolar line is divided into several gray segments, and the gray values of the pixels on each segment are similar.
  • the formula for gray-level segmentation is as follows:

    |I(x_(t+1), y_(t+1)) − I(x_t, y_t)| < T, for t = 1, 2, …, w − 1
  • the physical meaning of the above formula is to divide the continuous points of gray value in a certain range into one section.
  • I (x t , y t ) is the gray value of the pixel (x t , y t );
  • w is the number of pixels on a gray segment, that is, the length of the gray segment;
  • T is a certain threshold; the smaller its value, the fewer pixels are divided into each gray segment and the more gray segments there are.
  • the matching effect is the best when T is set to 3.
  • the resulting set of gray-level segments is K_4; (3) a dynamic programming algorithm (an optimization method for finding the best matching path) is used to establish the correspondence between gray-level segments, and linear interpolation is used to establish the correspondence between the pixels of corresponding gray-level segments, so that dense matching of all pixels between the images is realized and the third image matching set E_3 is obtained.
  • finally, based on the third image matching set, the preset 3ds Max software can be used to reconstruct the scene and restore the three-dimensional geometric information of the scene space, obtaining the reconstructed image.
  • the image matching program may also be divided into one or more modules, and the one or more modules are stored in the memory 11 and executed by one or more processors (the processor 12 in this embodiment) to complete this application.
  • the module referred to in the application refers to a series of computer program instruction segments capable of completing specific functions, and is used to describe the execution process of the image matching program in the image matching device.
  • referring to FIG. 3, it is a schematic diagram of the program modules of the image matching program in an embodiment of the image matching device of the present application.
  • the image matching program can be divided into a first matching module 10, a second matching module 20, a third matching module 30, and a reconstruction module 40. Exemplarily:
  • the first matching module 10 is used to generate an image imaging map according to the scene image taken by the aerial camera, and use the scale-invariant feature transformation method to perform the first image matching on the image imaging map to generate a first image matching set.
  • the second matching module 20 is configured to: based on the first image matching set, generate epipolar images and calculate the degree of overlap between the epipolar images, complete the second image matching, and generate a second image matching set.
  • the third matching module 30 is configured to: based on the second image matching set, establish dense matching of all pixels between the images, and generate a third image matching set.
  • the reconstruction module 40 is configured to perform three-dimensional reconstruction according to the third image matching set to obtain a reconstructed scene image.
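  • A minimal sketch of this four-module division as plain Python classes; the method names are illustrative, not taken from the application.

```python
class FirstMatchingModule:        # module 10: SIFT first matching -> E1
    def run(self, scene_images):
        raise NotImplementedError

class SecondMatchingModule:       # module 20: epipolar images and overlap -> E2
    def run(self, first_set):
        raise NotImplementedError

class ThirdMatchingModule:        # module 30: dense pixel matching -> E3
    def run(self, second_set):
        raise NotImplementedError

class ReconstructionModule:       # module 40: 3D reconstruction of the scene
    def run(self, third_set):
        raise NotImplementedError
```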
  • an embodiment of the present application also proposes a computer-readable storage medium having an image matching program stored on the computer-readable storage medium, and the image matching program can be executed by one or more processors to implement the following operations:
  • an image imaging map is generated according to the scene image taken by the aerial camera, and first image matching is performed on the image imaging map using the scale-invariant feature transform method to generate a first image matching set;
  • based on the first image matching set, epipolar images are generated and the degree of overlap between the epipolar images is calculated, completing the second image matching and generating a second image matching set;
  • based on the second image matching set, dense matching of all pixels between images is established, a third image matching set is generated, and three-dimensional reconstruction is performed to obtain a reconstructed scene image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to an image matching method and device (1), and a computer-readable storage medium. The method comprises: generating an image imaging map according to a scene image captured by an aerial photography instrument, performing first image matching on the image imaging map using a scale-invariant feature transform method, and generating a first image matching set (S1); on the basis of the first image matching set, generating epipolar line images, calculating the degree of overlap between the epipolar line images, completing second image matching, and generating a second image matching set (S2); on the basis of the second image matching set, establishing dense matching of all pixel points between images, generating a third image matching set, and performing three-dimensional reconstruction to obtain a reconstructed scene image (S3). With this new image matching solution applied to dense three-dimensional scenes under aerial photography, image matching efficiency can be improved.
PCT/CN2019/102187 2019-04-08 2019-08-23 Image matching method and device, and computer-readable storage medium WO2020206903A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910274078.5 2019-04-08
CN201910274078.5A CN110135455B (zh) Image matching method and device, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2020206903A1 true WO2020206903A1 (fr) 2020-10-15

Family

ID=67569487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102187 WO2020206903A1 (fr) 2019-04-08 2019-08-23 Image matching method and device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN110135455B (fr)
WO (1) WO2020206903A1 (fr)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135455B (zh) Image matching method and device, and computer-readable storage medium
CN111046906B (zh) Reliable densification matching method and system for planar feature points
CN112866504B (zh) Aerial triangulation densification method and system
CN114742869B (zh) Brain neurosurgical registration method based on graphic recognition, and electronic device


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104915965A (zh) Camera tracking method and device
CN107492127B (zh) Light-field camera parameter calibration method and device, storage medium, and computer device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110090337A1 (en) * 2008-02-01 2011-04-21 Imint Image Intelligence Ab Generation of aerial images
CN104751451A (zh) Dense point cloud extraction method based on low-altitude high-resolution UAV imagery
CN105847750A (zh) Geocoding-based method and device for real-time display of UAV video imagery
CN106023086A (zh) Aerial image and geographic data stitching method based on ORB feature matching
CN108759788A (zh) UAV image positioning and attitude determination method, and UAV
CN110135455A (zh) Image matching method and device, and computer-readable storage medium

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111160377A (zh) Image acquisition system with built-in key mechanism and evidence method thereof
CN112233228B (zh) UAV-based urban three-dimensional reconstruction method, device and storage medium
CN112233228A (zh) UAV-based urban three-dimensional reconstruction method, device and storage medium
CN112446951A (zh) Three-dimensional reconstruction method and device, electronic device, and computer storage medium
CN112446951B (zh) Three-dimensional reconstruction method and device, electronic device, and computer storage medium
CN112381864A (zh) Epipolar-geometry-based automatic registration technique for multi-source multi-scale high-resolution remote sensing images
CN112509109A (zh) Single-view illumination estimation method based on a neural network model
CN113096168A (zh) Optical remote sensing image registration method and system combining SIFT points and control line pairs
CN113096168B (zh) Optical remote sensing image registration method and system combining SIFT points and control line pairs
CN113741510A (zh) Inspection path planning method, device and storage medium
CN114140575A (zh) Three-dimensional model construction method, device and equipment
CN113963132A (zh) Three-dimensional plasma distribution reconstruction method and related device
CN114332349A (zh) Binocular structured-light edge reconstruction method, system and storage medium
CN113867410B (zh) Method and system for recognizing acquisition modes of UAV aerial photography data
CN114332349B (zh) Binocular structured-light edge reconstruction method, system and storage medium
CN113867410A (zh) Method and system for recognizing acquisition modes of UAV aerial photography data
CN115063460A (zh) High-precision adaptive interpolation and optimization method for pixels with the same name
CN114419116A (zh) Remote sensing image registration method and system based on mesh matching
CN114419116B (zh) Remote sensing image registration method and system based on mesh matching
CN114758151A (zh) Dense matching method for sequence images combining line features and triangulated-network constraints
CN114758151B (zh) Dense matching method for sequence images combining line features and triangulated-network constraints
CN114972536B (zh) Positioning and calibration method for aerial area-array whisk-broom cameras
CN114972536A (zh) Positioning and calibration method for aerial area-array whisk-broom cameras
CN115661368A (zh) Image matching method, device, server and storage medium
CN115661368B (zh) Image matching method, device, server and storage medium
CN116596844A (zh) Aerial flight quality inspection method, device, equipment and storage medium
CN116612067B (zh) Aerial flight quality inspection method, device, equipment and computer-readable storage medium
CN116596844B (zh) Aerial flight quality inspection method, device, equipment and storage medium
CN116612067A (zh) Aerial flight quality inspection method, device, equipment and computer-readable storage medium
CN116597184B (zh) Least-squares image matching method
CN116597184A (zh) Least-squares image matching method
CN117664087A (zh) Epipolar line generation method, system and equipment for cross-track circular-scanning satellite imagery
CN117664087B (zh) Epipolar line generation method, system and equipment for cross-track circular-scanning satellite imagery
CN118070434A (zh) Method and system for constructing process information models of automobile parts

Also Published As

Publication number Publication date
CN110135455A (zh) 2019-08-16
CN110135455B (zh) 2024-04-12

Similar Documents

Publication Publication Date Title
WO2020206903A1 (fr) Image matching method and device, and computer-readable storage medium
TWI777538B (zh) Image processing method, electronic device, and computer-readable storage medium
US11928800B2 Image coordinate system transformation method and apparatus, device, and storage medium
CN107705333B (zh) Spatial positioning method and device based on binocular camera
US11521311B1 Collaborative disparity decomposition
CA2826534C (fr) Filling points in a point cloud
WO2015135323A1 (fr) Camera tracking method and device
EP3274964B1 (fr) Automatically connecting images using visual features
US9286539B2 Constructing contours from imagery
CN111127524A (zh) Trajectory tracking and three-dimensional reconstruction method, system and device
CN112686877B (zh) Binocular-camera-based three-dimensional house damage model construction and measurement method and system
WO2021004416A1 (fr) Method and apparatus for establishing a beacon map based on visual beacons
US20160163114A1 Absolute rotation estimation including outlier detection via low-rank and sparse matrix decomposition
CN115439607A (zh) Three-dimensional reconstruction method and device, electronic device, and storage medium
WO2021244161A1 (fr) Model generation method and apparatus based on multi-view panoramic images
WO2023024393A1 (fr) Depth estimation method and apparatus, computer device, and storage medium
KR101593316B1 (ko) Method and apparatus for reconstructing a three-dimensional model using a stereo camera
WO2022237048A1 (fr) Pose acquisition method and apparatus, electronic device, storage medium, and program
CN116129037B (zh) Visuotactile sensor and three-dimensional reconstruction method, system, equipment and storage medium therefor
CN112150518B (zh) Attention-mechanism-based image stereo matching method and binocular device
WO2021142843A1 (fr) Image scanning method and device, apparatus, and storage medium
CN113436269B (zh) Dense stereo image matching method, device, and computer equipment
CN112634366A (zh) Position information generation method, related device, and computer program product
CN112002007A (zh) Model acquisition method and device based on air-ground images, equipment, and storage medium
Budianti et al. Background blurring and removal for 3d modelling of cultural heritage objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19924318

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19924318

Country of ref document: EP

Kind code of ref document: A1