CN111415298A - Image splicing method and device, electronic equipment and computer readable storage medium - Google Patents
Info
- Publication number
- CN111415298A CN111415298A CN202010201387.2A CN202010201387A CN111415298A CN 111415298 A CN111415298 A CN 111415298A CN 202010201387 A CN202010201387 A CN 202010201387A CN 111415298 A CN111415298 A CN 111415298A
- Authority
- CN
- China
- Prior art keywords
- edge
- image
- value
- map
- pixel point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
- G06T3/60—Rotation of whole images or parts thereof
- G06T5/77—Retouching; Inpainting; Scratch removal
- G06T2200/32—Indexing scheme for image data processing or generation, in general, involving image mosaicing
- G06T2207/10004—Still image; Photographic image
- G06T2207/10024—Color image
Abstract
The application discloses an image splicing (stitching) method and apparatus, an electronic device, and a computer-readable storage medium, relating to the field of image processing. The scheme is as follows: acquire a first image and a second image to be stitched; extract first edge information from the first image and second edge information from the second image; generate an edge density map of the first image from the first edge information, and select a target edge area from the edge density map; generate an edge distance map of the second image from the second edge information; match the target edge area against the edge distance map, and select a target matching area from it, namely the area for which the sum of the values of the pixel points corresponding to the target edge area is smallest; and stitch the first image and the second image according to the positional relationship between the target edge area and the target matching area. This scheme effectively improves the stitching quality and yields a better stitched image.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image stitching method and apparatus, an electronic device, and a computer-readable storage medium.
Background
In the prior art, image stitching is generally achieved by extracting and matching feature points. However, because the coordinates of image feature points are noisy, the images to be stitched cannot be matched accurately, so the stitching result is poor.
Disclosure of Invention
The embodiments of the present application provide an image stitching method and apparatus, an electronic device, and a computer-readable storage medium, to solve the problem that existing image stitching methods produce poor stitching results.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image stitching method, including:
acquiring a first image and a second image to be spliced;
extracting first edge information from the first image and second edge information from the second image;
generating an edge density map of the first image according to the first edge information, and selecting a target edge area from the edge density map;
generating an edge distance map of the second image according to the second edge information;
matching the target edge area with the edge distance map, and selecting a target matching area from the edge distance map; the target matching area is the area in the edge distance map with the smallest sum of values of the pixel points corresponding to the target edge area;
and splicing the first image and the second image according to the position relation between the target edge area and the target matching area.
In this way, image stitching is achieved through an edge matching process. Compared with existing image stitching methods, this avoids the poor stitching caused by noise in the coordinates of image feature points, effectively improving the stitching result and yielding a better stitched image.
Optionally, the generating an edge density map of the first image according to the first edge information includes:
generating an edge binary image of the first image according to the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by using a preset sliding window;
and taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
Therefore, using the number of edge pixel points around each edge pixel point as its edge density value makes it possible to find regions with salient edge features, improving the accuracy of subsequent edge matching.
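The density-map construction above can be sketched in a few lines. The pure-Python lists-of-lists image representation and the function name below are illustrative assumptions, not taken from the patent:

```python
def edge_density_map(edge_bin, win=3):
    """Build an edge density map from a binary edge image: each edge pixel's
    value 1 is replaced by the count of edge pixels inside a win x win
    sliding window centred on it; non-edge pixels keep the value 0."""
    h, w = len(edge_bin), len(edge_bin[0])
    r = win // 2
    density = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if edge_bin[i][j] != 1:
                continue  # only edge pixels receive a density value
            count = 0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and edge_bin[ni][nj] == 1:
                        count += 1
            density[i][j] = count
    return density
```

On a small cross-shaped edge pattern, for example, the centre pixel receives the highest density value, which is exactly what the subsequent target-edge-area selection relies on.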
Optionally, the step of respectively counting the number of edge pixel points around each edge pixel point in the edge binarization image by using a preset sliding window includes:
respectively aiming at each edge pixel point, executing the following processes:
respectively covering the edge pixel points by using preset sliding windows with different covering areas by taking the edge pixel points as centers, and calculating the density value of the edge pixel points in each sliding window;
and determining the maximum value in the density values of the edge pixels obtained by calculation as the number of the edge pixels around the edge pixels.
Therefore, calculating a density value of the edge pixel points within each sliding window normalises the results across window sizes, improving the accuracy of the obtained number of edge pixel points around each edge pixel point.
Optionally, the selecting a target edge region from the edge density map includes:
determining a maximum edge density value in the edge density map;
and determining the coverage area corresponding to the sliding window when the maximum edge density value is obtained through calculation as the target edge area.
Since regions with salient edge features have high edge pixel density, determining the target edge area in this way facilitates the subsequent edge matching process.
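A sketch of this selection step, under the same illustrative pure-Python representation; the single fixed window size is a simplifying assumption (the patent also allows windows of several sizes):

```python
def target_edge_area(density_map, win):
    """Find the maximum edge density value and return the bounding box
    (top, left, bottom, right) of the win x win window centred on it,
    clipped to the image, as the target edge area."""
    h, w = len(density_map), len(density_map[0])
    best_i, best_j, best = 0, 0, -1
    for i in range(h):
        for j in range(w):
            if density_map[i][j] > best:
                best, best_i, best_j = density_map[i][j], i, j
    r = win // 2
    return (max(0, best_i - r), max(0, best_j - r),
            min(h - 1, best_i + r), min(w - 1, best_j + r))
```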
Optionally, in the edge distance map, the value of each pixel point is the closest distance value between the pixel point and the edge pixel point.
Optionally, the generating an edge distance map of the second image according to the second edge information includes:
generating an edge map of the second image according to the second edge information; in the edge map, the value of an edge pixel point is 0, and the value of a non-edge pixel point is a preset value larger than 0;
respectively processing the values of the pixel points in the edge map in sequence by using a first formula and a second formula to obtain the edge distance map;
v(i,j) = minimum(v(i-1,j-1)+d1, v(i-1,j)+d2, v(i-1,j+1)+d1, v(i,j-1)+d2, v(i,j))    (formula one)
v(i,j) = minimum(v(i+1,j-1)+d1, v(i+1,j)+d2, v(i+1,j+1)+d1, v(i,j+1)+d2, v(i,j))    (formula two)
where i indexes the rows of the edge map, j indexes the columns, and v(i,j) denotes the value of the pixel point in row i, column j. In formula one, i increases from 2 to H and j increases from 2 to L; in formula two, i decreases from H-1 to 1 and j decreases from L-1 to 1. H is the number of rows of the edge map, L is the number of columns, and d1 and d2 are preset values.
Obtaining the edge distance map with formulas one and two, instead of directly computing the distance from each pixel point to every edge pixel point, simplifies the computation and makes the edge distance map faster to obtain.
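Formulas one and two amount to a two-pass chamfer distance transform. The sketch below uses 0-based indexing with boundary guards instead of the patent's 1-based ranges, which is equivalent; the function name is illustrative:

```python
def chamfer_distance_map(edge_map, d1=2, d2=1):
    """Two-pass chamfer distance transform: a forward pass (formula one,
    top-to-bottom and left-to-right) then a backward pass (formula two,
    bottom-to-top and right-to-left). edge_map holds 0 at edge pixels and a
    large preset value elsewhere; the result approximates each pixel's
    distance to the nearest edge pixel."""
    h, w = len(edge_map), len(edge_map[0])
    v = [row[:] for row in edge_map]
    for i in range(h):                      # forward pass
        for j in range(w):
            if i > 0 and j > 0:
                v[i][j] = min(v[i][j], v[i - 1][j - 1] + d1)
            if i > 0:
                v[i][j] = min(v[i][j], v[i - 1][j] + d2)
            if i > 0 and j < w - 1:
                v[i][j] = min(v[i][j], v[i - 1][j + 1] + d1)
            if j > 0:
                v[i][j] = min(v[i][j], v[i][j - 1] + d2)
    for i in range(h - 1, -1, -1):          # backward pass
        for j in range(w - 1, -1, -1):
            if i < h - 1 and j > 0:
                v[i][j] = min(v[i][j], v[i + 1][j - 1] + d1)
            if i < h - 1:
                v[i][j] = min(v[i][j], v[i + 1][j] + d2)
            if i < h - 1 and j < w - 1:
                v[i][j] = min(v[i][j], v[i + 1][j + 1] + d1)
            if j < w - 1:
                v[i][j] = min(v[i][j], v[i][j + 1] + d2)
    return v
```

With d1 = 2 and d2 = 1, diagonal steps cost 2 and orthogonal steps cost 1, matching the worked example of figs. 2a to 2d.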
Optionally, the acquiring the first image and the second image to be stitched includes:
performing feature point matching on a third image and a fourth image to obtain a spatial rotation relationship between the third image and the fourth image;
and processing the third image and the fourth image by utilizing the spatial rotation relationship to obtain the first image and the second image to be spliced which are positioned on the same plane.
Processing first with the spatial rotation relationship and then with the translation relationship improves the stitching result while keeping the image processing simple.
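In practice this remapping would apply the full spatial rotation (e.g. a homography computed via a library such as OpenCV). As a self-contained planar stand-in, the sketch below inverse-maps each output pixel through a 2-D rotation with nearest-neighbour sampling; the function name and the in-plane simplification are the author's assumptions, not the patent's method:

```python
import math

def rotate_image(img, theta_deg, fill=0):
    """Remap img into a rotated frame by inverse mapping: for each output
    pixel, rotate its coordinate back into the source image and take the
    nearest source pixel (or `fill` if it falls outside)."""
    h, w = len(img), len(img[0])
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0    # rotate about the image centre
    out = [[fill] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            y, x = i - cy, j - cx
            sy = c * y + s * x + cy          # inverse-rotated source row
            sx = -s * y + c * x + cx         # inverse-rotated source column
            si, sj = int(round(sy)), int(round(sx))
            if 0 <= si < h and 0 <= sj < w:
                out[i][j] = img[si][sj]
    return out
```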
In a second aspect, an embodiment of the present application provides an image stitching apparatus, including:
the acquisition module is used for acquiring a first image and a second image to be spliced;
an extraction module, configured to extract first edge information from the first image and extract second edge information from the second image;
the first processing module is used for generating an edge density map of the first image according to the first edge information and selecting a target edge area from the edge density map;
the generating module is used for generating an edge distance map of the second image according to the second edge information;
the second processing module is used for matching the target edge area with the edge distance map and selecting a target matching area from the edge distance map; the target matching area is an area with the smallest sum of values of pixel points corresponding to the target edge area in the edge distance graph;
and the splicing module is used for splicing the first image and the second image according to the position relation between the target edge area and the target matching area.
Optionally, the first processing module includes:
a first generating unit configured to generate an edge binarized image of the first image based on the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
the statistical unit is used for respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by utilizing a preset sliding window;
the first processing unit is used for taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
Optionally, the statistical unit is specifically configured to: respectively aiming at each edge pixel point, executing the following processes:
respectively covering the edge pixel points by using preset sliding windows with different covering areas by taking the edge pixel points as centers, and calculating the density value of the edge pixel points in each sliding window;
and determining the maximum value in the density values of the edge pixels obtained by calculation as the number of the edge pixels around the edge pixels.
Optionally, the first processing module includes:
a first determining unit, configured to determine a maximum edge density value in the edge density map;
and the second determining unit is used for determining the coverage area corresponding to the sliding window when the maximum edge density value is obtained through calculation as the target edge area.
Optionally, in the edge distance map, the value of each pixel point is the closest distance value between the pixel point and the edge pixel point.
Optionally, the generating module includes:
a second generating unit, configured to generate an edge map of the second image according to the second edge information; in the edge map, the value of an edge pixel point is 0, and the value of a non-edge pixel point is a preset value larger than 0;
the second processing unit is used for processing the values of the pixel points in the edge map in sequence respectively by using a first formula and a second formula to obtain the edge distance map;
vij=minimum(vi-1,j-1+d1,vi-1,j+d2,vi-1,j+1+d1,vi,j-1+d2,vij) Formula one
vij=minimum(vi+1,j-1+d1,vi+1,j+d2,vi+1,j+1+d1,vi,j+1+d2,vij) Formula two
Wherein i represents the row of the edge map, j represents the column of the edge map, vijThe method comprises the steps of representing values of pixel points in the ith row and the jth column, wherein in a formula I, the value range of i is gradually increased from 2 to H, the value range of j is gradually increased from 2 to L, in a formula II, the value range of i is gradually decreased from H-1 to 1, the value range of j is gradually decreased from L-1 to 1, H represents the row number of an edge graph, L represents the column number of the edge graph, and d1 and d2 represent preset values.
Optionally, the obtaining module includes:
the matching unit is used for matching the feature points of a third image and a fourth image to obtain a spatial rotation relation between the third image and the fourth image;
and the third processing unit is used for processing the third image and the fourth image by utilizing the spatial rotation relationship to obtain the first image and the second image to be spliced which are positioned on the same plane.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image stitching method as described above.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium storing computer instructions, where the computer instructions are configured to cause the computer to execute the image stitching method as described above.
One embodiment of the above application has the following advantage: image stitching is achieved through an edge matching process, so that, compared with existing image stitching methods, the poor stitching caused by noise in the coordinates of image feature points is avoided, effectively improving the stitching result and yielding a better stitched image. The method acquires a first image and a second image to be stitched; extracts first edge information from the first image and second edge information from the second image; generates an edge density map of the first image from the first edge information and selects a target edge area from it; generates an edge distance map of the second image from the second edge information; matches the target edge area against the edge distance map and selects a target matching area, namely the area for which the sum of the values of the pixel points corresponding to the target edge area is smallest; and stitches the first image and the second image according to the positional relationship between the target edge area and the target matching area. This solves the technical problem of poor stitching in the prior art and achieves the technical effect of effectively improving the stitching result.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a flow chart of an image stitching method according to an embodiment of the present application;
FIGS. 2a, 2b, 2c and 2d are schematic diagrams of edge distance maps obtained in the specific example of the present application;
FIG. 3 is a flow chart of an image stitching process in a specific example of the present application;
FIG. 4 is a block diagram of an image stitching device for implementing the image stitching method according to the embodiment of the present application;
fig. 5 is a block diagram of an electronic device for implementing an image stitching method according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details to aid understanding, which are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to fig. 1, fig. 1 is a flowchart of an image stitching method provided in an embodiment of the present application, where the method is applied to an electronic device, and as shown in fig. 1, the method includes the following steps:
step 101: and acquiring a first image and a second image to be spliced.
In this embodiment, the first image and the second image may be selected to be in the same plane, so as to realize stitching by determining the translation relationship between the first image and the second image.
In one embodiment, the first image and the second image may be obtained after rotational remapping, and the corresponding obtaining process includes: matching the feature points of the third image and the fourth image to obtain a spatial rotation relationship between the third image and the fourth image; and processing the third image and the fourth image by utilizing the spatial rotation relation to obtain a first image and a second image which are positioned on the same plane and are to be spliced. Wherein the third image and the fourth image are understood to be the original images to be stitched, such as the adjacent left image and the middle image, or the middle image and the right image.
Step 102: first edge information is extracted from the first image, and second edge information is extracted from the second image.
Optionally, in step 102, a canny algorithm may be used to extract edge information of the first image and the second image, which is not limited in this embodiment.
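In practice one would call an existing Canny implementation (e.g. OpenCV's cv2.Canny). Purely to illustrate edge extraction in a self-contained way, the sketch below thresholds Sobel gradient magnitude; it omits Canny's smoothing, non-maximum suppression, and hysteresis, so it is a simplified stand-in rather than Canny proper, and all names are the author's own:

```python
def sobel_edges(gray, thresh=2):
    """Binary edge map from the Sobel gradient: mark pixel (i, j) as an edge
    when |gx| + |gy| meets the threshold. Border pixels are left as 0."""
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = (gray[i - 1][j + 1] + 2 * gray[i][j + 1] + gray[i + 1][j + 1]
                  - gray[i - 1][j - 1] - 2 * gray[i][j - 1] - gray[i + 1][j - 1])
            gy = (gray[i + 1][j - 1] + 2 * gray[i + 1][j] + gray[i + 1][j + 1]
                  - gray[i - 1][j - 1] - 2 * gray[i - 1][j] - gray[i - 1][j + 1])
            if abs(gx) + abs(gy) >= thresh:
                edges[i][j] = 1
    return edges
```

On a vertical step image, only the pixels straddling the step are marked, which is the kind of binary edge output the density and distance maps are then built from.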
Step 103: and generating an edge density map of the first image according to the first edge information, and selecting a target edge area from the edge density map.
In this embodiment, the edge density map is mainly related to edge pixels, and may be generated by determining the number of edge pixels around each edge pixel. The target edge region may be selected as a region where the edge feature is significant.
Step 104: and generating an edge distance map of the second image according to the second edge information.
In this embodiment, the edge distance map is mainly related to the distance (e.g., the number of pixels apart) between each pixel in the second image and the edge pixel. Optionally, in the edge distance map, the value of each pixel may be the closest distance value between the pixel and the edge pixel, that is, the minimum value among the distance values (e.g., the number of pixels apart) between the pixel and all the edge pixels.
Step 105: and matching the target edge area with the edge distance map, and selecting a target matching area from the edge distance map.
The target matching region may be the region in the edge distance map for which the sum of the values of the pixel points corresponding to the target edge region is smallest. That this sum is minimal indicates that the region's edge error is minimal, i.e., it best matches the target edge region.
In one embodiment, when matching the target edge region with the edge distance map, the target edge region may be slid on the edge distance map for matching, and a region where a sum of values of pixel points covered by the target edge region is the smallest in the sliding process is used as the target matching region.
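The sliding match described above is classic chamfer matching: place the target edge area's binary edge content at every position of the edge distance map and keep the position where the sum of the covered distance values is smallest. A sketch with illustrative names, summing only under the template's edge pixels:

```python
def match_region(dist_map, template_edges):
    """Slide the binary template over the edge distance map; at each
    placement, sum the distance values under the template's edge pixels and
    return (top_left, sum) for the placement with the smallest sum."""
    h, w = len(dist_map), len(dist_map[0])
    th, tw = len(template_edges), len(template_edges[0])
    best_pos, best_sum = (0, 0), float('inf')
    for i in range(h - th + 1):
        for j in range(w - tw + 1):
            s = sum(dist_map[i + di][j + dj]
                    for di in range(th) for dj in range(tw)
                    if template_edges[di][dj] == 1)
            if s < best_sum:
                best_sum, best_pos = s, (i, j)
    return best_pos, best_sum
```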
Step 106: and splicing the first image and the second image according to the position relation between the target edge area and the target matching area.
As an optional implementation manner, when performing image stitching according to a position relationship between the target edge region and the target matching region, a translation matrix between the first image and the second image may be determined according to the position relationship, and then the first image and the second image may be stitched according to the translation matrix.
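Once the offset between the target edge area and the target matching area is known, stitching reduces to pasting the two images on a shared canvas with that translation. A minimal sketch; resolving the overlap by letting the first image win is an assumption, since the patent only requires stitching according to the translation:

```python
def stitch_by_translation(img1, img2, dy, dx):
    """Paste img2 onto a canvas shifted by (dy, dx) relative to img1;
    img1's pixels take priority in the overlap region."""
    h1, w1 = len(img1), len(img1[0])
    h2, w2 = len(img2), len(img2[0])
    top, left = min(0, dy), min(0, dx)          # canvas origin offset
    height = max(h1, dy + h2) - top
    width = max(w1, dx + w2) - left
    canvas = [[0] * width for _ in range(height)]
    for i in range(h2):                          # place img2 first
        for j in range(w2):
            canvas[i + dy - top][j + dx - left] = img2[i][j]
    for i in range(h1):                          # img1 overwrites overlap
        for j in range(w1):
            canvas[i - top][j - left] = img1[i][j]
    return canvas
```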
The above image stitching method achieves stitching through an edge matching process, so that, compared with existing methods, it avoids the poor stitching caused by noise in the coordinates of image feature points, effectively improving the stitching result and yielding a better stitched image.
In this embodiment of the present application, the process of generating the edge density map in step 103 may include:
generating an edge binary image of the first image according to the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by using a preset sliding window;
and taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
Using the number of surrounding edge pixel points as each edge pixel point's density value makes it possible to find areas where edge pixels cluster, i.e., areas with salient edge features, improving the accuracy of subsequent edge matching.
Optionally, the process of respectively counting the number of edge pixels around each edge pixel by using the preset sliding window may be as follows: respectively aiming at each edge pixel point, executing the following processes:
and covering the edge pixel points by using a sliding window with a preset length and a preset width by taking the edge pixel points as a center, and determining the number of the edge pixel points in the sliding window as the number of the edge pixel points around the edge pixel points. Therefore, the number of the edge pixel points around each edge pixel point can be conveniently determined.
In addition, the above-mentioned process of utilizing to predetermine the sliding window, statistics edge pixel point quantity around every edge pixel point respectively can also be: respectively aiming at each edge pixel point, executing the following processes:
taking each edge pixel point as the centre, cover it with the preset sliding windows of different coverage areas and calculate the edge pixel density value within each window; the maximum of these density values is taken as the number of edge pixel points around that edge pixel point. Normalising by window area in this way improves the accuracy of the obtained surrounding edge-pixel counts.
It should be noted that the preset sliding windows with different coverage areas may be obtained by growing the length and width of an initial sliding window by a preset step. For example, if the side length of the initial sliding window is 25% of X, the smaller of the first image's length and width, and the preset step is 10% of X, then when searching the right 50% of the first image, three sliding windows may be used, with side lengths of 25%, 35% and 45% of X. Each edge pixel point in that area is covered by each window in turn; the density value within the n-th window is computed as kn / Ln^2, where kn is the number of edge pixel points inside the n-th window and Ln^2 is its area (Ln being its side length), and the maximum over the n windows is taken as the number of edge pixel points around the corresponding edge pixel point.
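The normalised multi-window density kn / Ln^2 can be sketched as follows; the pure-Python representation and names are illustrative:

```python
def normalized_density(edge_bin, i, j, window_lengths):
    """For edge pixel (i, j), compute k_n / L_n**2 for each square window of
    side length L_n centred on it, where k_n counts the edge pixels inside
    the window, and return the maximum over all windows."""
    h, w = len(edge_bin), len(edge_bin[0])
    best = 0.0
    for L in window_lengths:
        r = L // 2
        k = sum(1 for di in range(-r, r + 1) for dj in range(-r, r + 1)
                if 0 <= i + di < h and 0 <= j + dj < w
                and edge_bin[i + di][j + dj] == 1)
        best = max(best, k / (L * L))
    return best
```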
Further, after obtaining the edge density map with the preset sliding windows, selecting the target edge area from the edge density map may include: determining the maximum edge density value in the edge density map, and taking the coverage area of the sliding window that produced this maximum as the target edge area. Because edge pixel density is generally high in regions with salient edge features, determining the target edge area in this way aids the subsequent edge matching process.
In an embodiment of the present application, to simplify the calculation process, the generating the edge distance map according to the second edge information may include:
generating an edge map of the second image according to the second edge information; in the edge graph, the value of the edge pixel is 0, and the value of the non-edge pixel is a preset value (such as 1, 5, 10, 200, etc.) greater than 0;
respectively processing the values of the pixel points in the edge map in sequence by using a first formula and a second formula to obtain the edge distance map;
v(i,j) = minimum(v(i-1,j-1)+d1, v(i-1,j)+d2, v(i-1,j+1)+d1, v(i,j-1)+d2, v(i,j))    (formula one)
v(i,j) = minimum(v(i+1,j-1)+d1, v(i+1,j)+d2, v(i+1,j+1)+d1, v(i,j+1)+d2, v(i,j))    (formula two)
where i indexes the rows of the edge map, j indexes the columns, and v(i,j) denotes the value of the pixel point in row i, column j. In formula one, i increases from 2 to H and j increases from 2 to L; in formula two, i decreases from H-1 to 1 and j decreases from L-1 to 1. H is the number of rows of the edge map, L is the number of columns, and d1 and d2 are preset values, e.g. d1 equal to 2 and d2 equal to 1, or d1 equal to 4 and d2 equal to 3.
For formula one, it can be understood that pixel processing is performed on the edge map in the forward direction, that is, from left to right (the direction in which j gradually increases) and from top to bottom (the direction in which i gradually increases). For formula two, pixel processing is performed on the edge map in the reverse direction, that is, from right to left (the direction in which j gradually decreases) and from bottom to top (the direction in which i gradually decreases). In a specific implementation, pixel processing may first be performed on the edge map based on formula one, and then further pixel processing may be performed based on formula two.
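The two-pass scheme above is a chamfer distance transform. A self-contained sketch (boundary pixels are handled by skipping out-of-range neighbours, a choice the text leaves open) could be:

```python
import numpy as np

def chamfer_distance_map(edge_map, d1=2, d2=1):
    # Two-pass chamfer distance transform over an edge map in which edge
    # pixels are 0 and non-edge pixels hold a large preset value (e.g. 200).
    v = edge_map.astype(np.int64).copy()
    h, w = v.shape
    # Forward pass (formula one): top-to-bottom, left-to-right.
    for i in range(1, h):
        for j in range(w):
            cand = [v[i, j], v[i - 1, j] + d2]
            if j > 0:
                cand += [v[i - 1, j - 1] + d1, v[i, j - 1] + d2]
            if j + 1 < w:
                cand.append(v[i - 1, j + 1] + d1)
            v[i, j] = min(cand)
    # Backward pass (formula two): bottom-to-top, right-to-left.
    for i in range(h - 2, -1, -1):
        for j in range(w - 1, -1, -1):
            cand = [v[i, j], v[i + 1, j] + d2]
            if j + 1 < w:
                cand += [v[i + 1, j + 1] + d1, v[i, j + 1] + d2]
            if j > 0:
                cand.append(v[i + 1, j - 1] + d1)
            v[i, j] = min(cand)
    return v
```

For a single edge pixel, each resulting value approximates the distance to it: d2 per axial step and d1 per diagonal step.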
For example, referring to fig. 2a, fig. 2b, fig. 2c and fig. 2d: fig. 2a shows an edge map corresponding to an original image, in which 0 indicates an edge pixel point and 200 indicates a non-edge background pixel point. Fig. 2b shows the result (only a portion shown) of processing the values of the pixel points in the edge map of fig. 2a using formula one (with, for example, d1 equal to 2 and d2 equal to 1). Comparing fig. 2a and fig. 2b: 1) for the pixel point in row 2, column 2 of fig. 2b, based on formula one, the value of the upper-left pixel plus 2 equals 202, the value of the upper pixel plus 1 equals 201, the value of the upper-right pixel plus 2 equals 202, the value of the left pixel plus 1 equals 201, and the original value of the pixel point is 200, so the value of the pixel point remains 200; 2) for the pixel point in row 2, column 5 of fig. 2b, based on formula one, the value of the upper-left pixel plus 2 equals 202, the value of the upper pixel plus 1 equals 201, the value of the upper-right pixel plus 2 equals 202, but the value of the left pixel plus 1 equals 1, and the original value of the pixel point is 200, so the value of the pixel point becomes 1. Fig. 2c shows the result (only a portion shown) of processing the image of fig. 2b using formula two (again with d1 equal to 2 and d2 equal to 1): fig. 2c is a schematic diagram of the operation carried from the pixel point in row 6, column 6 to the pixel point in row 5, column 2, from right to left and from bottom to top, before the full-image operation has completed. Fig. 2d shows the result of the full-image operation based on formula one and formula two.
Optionally, after a complete stitched image, such as a panoramic image, is obtained using the image stitching method of the embodiment of the present application, the tone of the image may also be adjusted. The specific adjustment process may be: first calculate the brightness b of the whole image, then calculate the gamma value of the image using the following formula three, and perform gamma scaling on the image with 0.8 times the gamma value.
where n represents the number of pixel points in the image, x_i represents the pixel value of pixel point i, and the value range of i is 0 to n.
Further, after the gamma value of the image is corrected based on the above process, the gray-level mean values of the three channels, r_mean, g_mean and b_mean, and the total gray-level mean total_mean = (r_mean + g_mean + b_mean)/3 can be calculated, and a correction coefficient computed for each channel: rconf = total_mean/r_mean, gconf = total_mean/g_mean, bconf = total_mean/b_mean. Then the three channels of each pixel are revised: II_new_r = II_old_r * rconf, II_new_g = II_old_g * gconf, II_new_b = II_old_b * bconf, thereby balancing the overall tone of the image.
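Since formula three itself is not reproduced in this text, the sketch below covers only the per-channel gray-world correction step just described; the H x W x 3 float layout in (r, g, b) order and the 0-255 range are assumptions:

```python
import numpy as np

def tone_adjust(img):
    # img: H x W x 3 float array in [0, 255], channels in (r, g, b) order.
    # Scale each channel so its mean matches the overall gray-level mean.
    r_mean, g_mean, b_mean = img.reshape(-1, 3).mean(axis=0)
    total_mean = (r_mean + g_mean + b_mean) / 3
    conf = np.array([total_mean / r_mean,
                     total_mean / g_mean,
                     total_mean / b_mean])
    return img * conf  # broadcasts the per-channel coefficients
```

After this correction, all three channel means equal total_mean, which is the balancing effect the text describes.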
It should be noted that the image stitching method according to the embodiment of the present application may be used in scenarios that generate panoramic images, for example panoramic images displayed by an internet search engine, such as Virtual Reality (VR) scenes.
The image stitching process in the embodiment of the present application is described below with reference to fig. 3.
Referring to fig. 3, the image stitching process in the embodiment of the present application may include the following steps:
Step 31: automatically capture original images (pictures). For example, for stitched webpage images displayed on a search result page, the xml module of python version 2.7 is used to search for images whose target website suffixes are "jpg" and "png", and to record and download them. For certain known webpage formats where the position information of the image material is detailed, the target field can be customized directly to improve capturing efficiency. After capturing is finished, the images are downloaded uniformly in batches, stored locally, and named 1.jpg, 2.jpg, and so on.
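The original crawling code is not reproduced in the text; as a rough, hypothetical sketch of the suffix-based URL search it describes, page text could be filtered with a regular expression (the pattern and function name below are illustrative assumptions, not the embodiment's code):

```python
import re

# Match http(s) URLs ending in .jpg or .png, stopping at quotes/brackets.
IMG_URL_RE = re.compile(r'https?://[^\s"\'<>]+?\.(?:jpg|png)', re.IGNORECASE)

def find_image_urls(html):
    # Return the image URLs with "jpg"/"png" suffixes found in a web page.
    return IMG_URL_RE.findall(html)
```

The matched URLs would then be downloaded in a batch and saved locally as 1.jpg, 2.jpg, etc., as the step describes.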
Step 32: determining the spatial rotation relationship of the images, and processing the corresponding images to be stitched (such as the original images 1 and 2, corresponding to the third image and the fourth image) by using the spatial rotation relationship to obtain the images to be stitched (such as the images 3 and 4 to be stitched, corresponding to the first image and the second image; further such as the left image and the middle image to be stitched, or the middle image and the right image to be stitched) in the same plane.
Optionally, since the Scale-Invariant Feature Transform (SIFT) operator is invariant to scale, rotation, brightness and similar changes, the SIFT operator may be selected in this step to collect the feature points of the images, which are then paired according to their SIFT feature descriptors. For example, one image is divided into 4 regions a, b, c and d, representing upper left, upper right, lower left and lower right respectively, and the most suitable matching region, i.e. the spatial three-dimensional rotation matching relationship between the two images, is found by matching the SIFT feature points in a region against the feature points of the 4 regions of the other image. The pairing relation, the serial numbers of the corresponding feature points, and the descriptors can be recorded to facilitate subsequent calculation.
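As an illustrative sketch of the descriptor-pairing step (not the embodiment's exact procedure), nearest-neighbour matching with Lowe's ratio test over SIFT descriptor arrays could be written as follows; the descriptors are assumed to be rows of numpy arrays, and the ratio threshold 0.75 is an assumed value:

```python
import numpy as np

def match_descriptors(desc1, desc2, ratio=0.75):
    # Lowe's ratio test: keep pairs whose nearest neighbour in desc2 is
    # clearly better than the second-nearest neighbour.
    matches = []
    for i, d in enumerate(desc1):
        dist = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dist)[:2]
        if dist[j] < ratio * dist[k]:
            matches.append((i, int(j)))
    return matches
```

The ratio test discards ambiguous pairings, which reduces the error the text notes must later be handled by RANSAC.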
A large number of SIFT feature points are obtained by the above procedure, but the feature points must be optimized in consideration of errors that may occur. Here the RANSAC algorithm and the least squares method can be used to estimate an appropriate rotation matrix. The equation for the underlying rotation matrix can be expressed as follows:
For the above equation, if there is a definite (non-zero) solution, the rank of the coefficient matrix A is at most 8, because the system is homogeneous. If the rank of the coefficient matrix A is exactly 8, the solution is unique up to a scale factor. But if there is noise in the pixel coordinates, the rank of the matrix A may be greater than 8 (i.e. equal to 9, since A is an n × 9 matrix). In that case the least-squares solution may be obtained using Singular Value Decomposition (SVD): the solution vector f is the singular vector corresponding to the smallest singular value of the coefficient matrix A, i.e., after the singular value decomposition A = UDV^T, f is the last column vector of the matrix V, which minimizes ||Af|| under the constraint ||f|| = 1.
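The SVD step described above can be sketched as a generic homogeneous least-squares solver (this is the standard technique, not code from the embodiment):

```python
import numpy as np

def homogeneous_lstsq(A):
    # Minimise ||A f|| subject to ||f|| = 1: take the right singular
    # vector associated with the smallest singular value of A.
    # numpy returns singular values in descending order, so that vector
    # is the last row of vt (equivalently, the last column of V).
    _, _, vt = np.linalg.svd(A)
    return vt[-1]
```

The returned f is determined only up to sign, consistent with the scale-factor ambiguity noted above.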
Step 33: and determining the space translation relationship of the obtained images to be spliced on the same plane, and splicing the images by using the space translation relationship to obtain the panoramic image.
Note that, the manner of obtaining the spatial translation relationship in step 33 can be referred to in the above embodiments, and is not described herein again.
Step 34: and carrying out tone adjustment on the obtained panoramic image. The specific adjustment process can be referred to the above embodiments, and is not described herein again.
Therefore, through the image stitching process in this embodiment, a panoramic image with an excellent effect can be obtained while automatically ensuring correct logic and correct image output, which effectively reduces manual labor and improves the user's sense of realism when viewing the panoramic image.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an image stitching apparatus according to an embodiment of the present application, and as shown in fig. 4, the image stitching apparatus 40 includes:
an obtaining module 41, configured to obtain a first image and a second image to be stitched;
an extracting module 42, configured to extract first edge information from the first image and extract second edge information from the second image;
a first processing module 43, configured to generate an edge density map of the first image according to the first edge information, and select a target edge area from the edge density map;
a generating module 44, configured to generate an edge distance map of the second image according to the second edge information;
a second processing module 45, configured to match the target edge region with the edge distance map, and select a target matching region from the edge distance map; the target matching area is an area with the smallest sum of values of pixel points corresponding to the target edge area in the edge distance graph;
a stitching module 46, configured to stitch the first image and the second image according to a position relationship between the target edge region and the target matching region.
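To illustrate the second processing module's search for the target matching area, a brute-force sketch under assumed array inputs (a binary mask for the target edge region and the edge distance map) could be; this is an illustration, not the claimed implementation:

```python
import numpy as np

def find_target_matching_region(edge_region_mask, distance_map):
    # Slide the target edge region (binary mask, 1 = edge pixel) over the
    # edge distance map; return the top-left corner of the position where
    # the sum of distance values under the mask's edge pixels is smallest.
    mh, mw = edge_region_mask.shape
    h, w = distance_map.shape
    best, best_pos = None, None
    for i in range(h - mh + 1):
        for j in range(w - mw + 1):
            s = (distance_map[i:i + mh, j:j + mw] * edge_region_mask).sum()
            if best is None or s < best:
                best, best_pos = s, (i, j)
    return best_pos, best
```

A sum of zero means every edge pixel of the target region lands exactly on an edge pixel of the second image, i.e. a perfect match.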
Optionally, the first processing module 43 includes:
a first generating unit configured to generate an edge binarized image of the first image based on the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
the statistical unit is used for respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by utilizing a preset sliding window;
the first processing unit is used for taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
Optionally, the statistical unit is specifically configured to: respectively aiming at each edge pixel point, executing the following processes:
respectively covering the edge pixel points by using preset sliding windows with different covering areas by taking the edge pixel points as centers, and calculating the density value of the edge pixel points in each sliding window;
and determining the maximum value in the density values of the edge pixels obtained by calculation as the number of the edge pixels around the edge pixels.
Optionally, the first processing module 43 includes:
a first determining unit, configured to determine a maximum edge density value in the edge density map;
and the second determining unit is used for determining the coverage area corresponding to the sliding window when the maximum edge density value is obtained through calculation as the target edge area.
Optionally, in the edge distance map, the value of each pixel point is the closest distance value between the pixel point and the edge pixel point.
Optionally, the generating module 44 includes:
a second generating unit, configured to generate an edge map of the second image according to the second edge information; in the edge graph, the value of an edge pixel point is 0, and the value of a non-edge pixel point is a preset value larger than 0;
the second processing unit is used for processing the values of the pixel points in the edge map in sequence respectively by using a first formula and a second formula to obtain the edge distance map;
v_{i,j} = minimum(v_{i-1,j-1} + d1, v_{i-1,j} + d2, v_{i-1,j+1} + d1, v_{i,j-1} + d2, v_{i,j})    (formula one)
v_{i,j} = minimum(v_{i+1,j-1} + d1, v_{i+1,j} + d2, v_{i+1,j+1} + d1, v_{i,j+1} + d2, v_{i,j})    (formula two)
where i represents the row of the edge map, j represents the column of the edge map, and v_{i,j} represents the value of the pixel point in row i, column j. In formula one, i increases gradually from 2 to H and j increases gradually from 2 to L; in formula two, i decreases gradually from H-1 to 1 and j decreases gradually from L-1 to 1. H represents the number of rows of the edge map, L represents the number of columns of the edge map, and d1 and d2 represent preset values.
Optionally, the obtaining module 41 includes:
the matching unit is used for matching the feature points of a third image and a fourth image to obtain a spatial rotation relation between the third image and the fourth image;
and the third processing unit is used for processing the third image and the fourth image by utilizing the spatial rotation relationship to obtain the first image and the second image to be spliced which are positioned on the same plane.
It can be understood that the image stitching device 40 according to the embodiment of the present application can implement the processes implemented in the method embodiment shown in fig. 1 and achieve the same beneficial effects, and for avoiding repetition, the details are not repeated here.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 5 is a block diagram of an electronic device for implementing the image stitching method according to the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 5, the electronic apparatus includes: one or more processors 501, memory 502, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, as desired, along with multiple memories. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 5, one processor 501 is taken as an example.
The memory 502, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules (e.g., the acquisition module 41, the extraction module 42, the first processing module 43, the generation module 44, the second processing module 45, and the stitching module 46 shown in fig. 4) corresponding to the image stitching method in the embodiment of the present application. The processor 501 executes various functional applications of the server and data processing by running non-transitory software programs, instructions and modules stored in the memory 502, that is, implements the image stitching method in the above method embodiment.
The memory 502 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by use of the electronic device, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 502 optionally includes memory located remotely from processor 501, which may be connected to an electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the image stitching method may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or other means, and fig. 5 illustrates the connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function controls of the electronic device of the image stitching method, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, etc. The output device 504 may include a display device, an auxiliary lighting device (e.g., an LED), a tactile feedback device (e.g., a vibrating motor), etc.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal.
The systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user, and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer, thereby providing interaction with the user.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiments of the present application, image stitching can be achieved by means of an edge matching process. Compared with existing image stitching methods, this avoids the poor stitching effect caused by noise in the coordinates of image feature points and the like, effectively improving the image stitching effect and producing a better stitched image.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (16)
1. An image stitching method, comprising:
acquiring a first image and a second image to be spliced;
extracting first edge information from the first image and second edge information from the second image;
generating an edge density map of the first image according to the first edge information, and selecting a target edge area from the edge density map;
generating an edge distance map of the second image according to the second edge information;
matching the target edge area with the edge distance map, and selecting a target matching area from the edge distance map; the target matching area is an area with the smallest sum of values of pixel points corresponding to the target edge area in the edge distance graph;
and splicing the first image and the second image according to the position relation between the target edge area and the target matching area.
2. The method of claim 1, wherein generating an edge density map of the first image from the first edge information comprises:
generating an edge binarization image of the first image according to the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by using a preset sliding window;
and taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
3. The method according to claim 2, wherein the separately counting the number of edge pixels around each edge pixel in the edge binarized image by using a preset sliding window comprises:
respectively aiming at each edge pixel point, executing the following processes:
respectively covering the edge pixel points by using preset sliding windows with different covering areas by taking the edge pixel points as centers, and calculating the density value of the edge pixel points in each sliding window;
and determining the maximum value in the density values of the edge pixels obtained by calculation as the number of the edge pixels around the edge pixels.
4. The method of claim 3, wherein the selecting a target edge region from the edge density map comprises:
determining a maximum edge density value in the edge density map;
and determining the coverage area corresponding to the sliding window when the maximum edge density value is obtained through calculation as the target edge area.
5. The method of claim 1, wherein in the edge distance map, the value of each pixel point is the closest distance value of the pixel point from an edge pixel point.
6. The method of claim 1, wherein generating the edge distance map of the second image according to the second edge information comprises:
generating an edge map of the second image according to the second edge information; in the edge graph, the value of an edge pixel point is 0, and the value of a non-edge pixel point is a preset value larger than 0;
respectively processing the values of the pixel points in the edge map in sequence by using a first formula and a second formula to obtain the edge distance map;
v_{i,j} = minimum(v_{i-1,j-1} + d1, v_{i-1,j} + d2, v_{i-1,j+1} + d1, v_{i,j-1} + d2, v_{i,j})    (formula one)
v_{i,j} = minimum(v_{i+1,j-1} + d1, v_{i+1,j} + d2, v_{i+1,j+1} + d1, v_{i,j+1} + d2, v_{i,j})    (formula two)
where i represents the row of the edge map, j represents the column of the edge map, and v_{i,j} represents the value of the pixel point in row i, column j. In formula one, i increases gradually from 2 to H and j increases gradually from 2 to L; in formula two, i decreases gradually from H-1 to 1 and j decreases gradually from L-1 to 1. H represents the number of rows of the edge map, L represents the number of columns of the edge map, and d1 and d2 represent preset values.
7. The method of claim 1, wherein the acquiring the first image and the second image to be stitched comprises:
performing feature point matching on a third image and a fourth image to obtain a spatial rotation relationship between the third image and the fourth image;
and processing the third image and the fourth image by utilizing the spatial rotation relationship to obtain the first image and the second image to be spliced which are positioned on the same plane.
8. An image stitching device, comprising:
the acquisition module is used for acquiring a first image and a second image to be spliced;
an extraction module, configured to extract first edge information from the first image and extract second edge information from the second image;
the first processing module is used for generating an edge density map of the first image according to the first edge information and selecting a target edge area from the edge density map;
the generating module is used for generating an edge distance map of the second image according to the second edge information;
the second processing module is used for matching the target edge area with the edge distance map and selecting a target matching area from the edge distance map; the target matching area is an area with the smallest sum of values of pixel points corresponding to the target edge area in the edge distance graph;
and the splicing module is used for splicing the first image and the second image according to the position relation between the target edge area and the target matching area.
9. The apparatus of claim 8, wherein the first processing module comprises:
a first generating unit configured to generate an edge binarized image of the first image based on the first edge information; in the edge binary image, the value of an edge pixel point is 1, and the value of a non-edge pixel point is 0;
the statistical unit is used for respectively counting the number of edge pixel points around each edge pixel point in the edge binary image by utilizing a preset sliding window;
the first processing unit is used for taking the number of the edge pixel points around each edge pixel point as the edge density value of each edge pixel point, and replacing the value of each edge pixel point with the corresponding edge density value to obtain the edge density graph.
10. The apparatus according to claim 9, wherein the statistical unit is specifically configured to: respectively aiming at each edge pixel point, executing the following processes:
respectively covering the edge pixel points by using preset sliding windows with different covering areas by taking the edge pixel points as centers, and calculating the density value of the edge pixel points in each sliding window;
and determining the maximum value in the density values of the edge pixels obtained by calculation as the number of the edge pixels around the edge pixels.
11. The apparatus of claim 10, wherein the first processing module comprises:
a first determining unit, configured to determine a maximum edge density value in the edge density map;
and the second determining unit is used for determining the coverage area corresponding to the sliding window when the maximum edge density value is obtained through calculation as the target edge area.
12. The apparatus of claim 8, wherein in the edge distance map, the value of each pixel point is the closest distance value of the pixel point from an edge pixel point.
13. The apparatus of claim 8, wherein the generating module comprises:
a second generating unit, configured to generate an edge map of the second image according to the second edge information; in the edge graph, the value of an edge pixel point is 0, and the value of a non-edge pixel point is a preset value larger than 0;
the second processing unit is used for processing the values of the pixel points in the edge map in sequence respectively by using a first formula and a second formula to obtain the edge distance map;
v_{i,j} = minimum(v_{i-1,j-1} + d1, v_{i-1,j} + d2, v_{i-1,j+1} + d1, v_{i,j-1} + d2, v_{i,j})    (formula one)
v_{i,j} = minimum(v_{i+1,j-1} + d1, v_{i+1,j} + d2, v_{i+1,j+1} + d1, v_{i,j+1} + d2, v_{i,j})    (formula two)
where i represents the row of the edge map, j represents the column of the edge map, and v_{i,j} represents the value of the pixel point in row i, column j. In formula one, i increases gradually from 2 to H and j increases gradually from 2 to L; in formula two, i decreases gradually from H-1 to 1 and j decreases gradually from L-1 to 1. H represents the number of rows of the edge map, L represents the number of columns of the edge map, and d1 and d2 represent preset values.
14. The apparatus of claim 8, wherein the obtaining module comprises:
the matching unit is used for matching the feature points of a third image and a fourth image to obtain a spatial rotation relation between the third image and the fourth image;
and the third processing unit is used for processing the third image and the fourth image by utilizing the spatial rotation relationship to obtain the first image and the second image to be spliced which are positioned on the same plane.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010201387.2A CN111415298B (en) | 2020-03-20 | 2020-03-20 | Image stitching method and device, electronic equipment and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010201387.2A CN111415298B (en) | 2020-03-20 | 2020-03-20 | Image stitching method and device, electronic equipment and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111415298A true CN111415298A (en) | 2020-07-14 |
CN111415298B CN111415298B (en) | 2023-06-02 |
Family
ID=71491289
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010201387.2A Active CN111415298B (en) | 2020-03-20 | 2020-03-20 | Image stitching method and device, electronic equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111415298B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112651983A (en) * | 2020-12-15 | 2021-04-13 | 北京百度网讯科技有限公司 | Mosaic image identification method and device, electronic equipment and storage medium |
CN113744401A (en) * | 2021-09-09 | 2021-12-03 | 网易(杭州)网络有限公司 | Terrain splicing method and device, electronic equipment and storage medium |
CN114004744A (en) * | 2021-10-15 | 2022-02-01 | 深圳市亚略特生物识别科技有限公司 | Fingerprint splicing method and device, electronic equipment and medium |
CN114066732A (en) * | 2021-11-21 | 2022-02-18 | 特斯联科技集团有限公司 | Visible light image geometric radiation splicing processing method of multi-source monitor |
CN117173161A (en) * | 2023-10-30 | 2023-12-05 | 杭州海康威视数字技术股份有限公司 | Content security detection method, device, equipment and system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080074441A1 (en) * | 2006-09-27 | 2008-03-27 | Fujitsu Limited | Image processing apparatus, image processing method, image processing program, and image pickup apparatus |
US20080266408A1 (en) * | 2007-04-26 | 2008-10-30 | Core Logic, Inc. | Apparatus and method for generating panorama image and computer readable medium stored thereon computer executable instructions for performing the method |
US20120307000A1 (en) * | 2011-06-01 | 2012-12-06 | Apple Inc. | Image Registration Using Sliding Registration Windows |
US20130229548A1 (en) * | 2011-06-24 | 2013-09-05 | Rakuten, Inc. | Image providing device, image processing method, image processing program, and recording medium |
US20160255281A1 (en) * | 2015-02-27 | 2016-09-01 | Brother Kogyo Kabushiki Kaisha | Image processing device generating arranged image data representing arranged image in which images are arranged according to determined relative position |
US20170148222A1 (en) * | 2014-10-31 | 2017-05-25 | Fyusion, Inc. | Real-time mobile device capture and generation of art-styled ar/vr content |
CN108109108A (en) * | 2016-11-25 | 2018-06-01 | 北京视联动力国际信息技术有限公司 | Image splicing method and device based on a cosine similarity adaptive algorithm
CN110309787A (en) * | 2019-07-03 | 2019-10-08 | 电子科技大学 | Human sitting posture detection method based on a depth camera
CN110738599A (en) * | 2019-10-14 | 2020-01-31 | 北京百度网讯科技有限公司 | Image splicing method and device, electronic equipment and storage medium |
Non-Patent Citations (5)
Title |
---|
S. Battiato et al.: "Digital Mosaic Frameworks - An Overview", Computer Graphics Forum *
Yong Chen et al.: "An improved image mosaic based on Canny edge and an 18-dimensional descriptor", Optik *
Jiang Lifeng et al.: "Image stitching algorithm based on local edge density and local entropy", Journal of Shandong University of Technology (Natural Science Edition) *
Lei Tong et al.: "Automatic stitching simulation of captured panoramic building images", Computer Simulation *
Yan Zhenxiang et al.: "Microscopic image stitching based on regional frog-leaping search and contour matching", Laser & Optoelectronics Progress *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112651983A (en) * | 2020-12-15 | 2021-04-13 | 北京百度网讯科技有限公司 | Mosaic image identification method and device, electronic equipment and storage medium |
CN112651983B (en) * | 2020-12-15 | 2023-08-01 | 北京百度网讯科技有限公司 | Splice graph identification method and device, electronic equipment and storage medium |
CN113744401A (en) * | 2021-09-09 | 2021-12-03 | 网易(杭州)网络有限公司 | Terrain splicing method and device, electronic equipment and storage medium |
CN114004744A (en) * | 2021-10-15 | 2022-02-01 | 深圳市亚略特生物识别科技有限公司 | Fingerprint splicing method and device, electronic equipment and medium |
CN114066732A (en) * | 2021-11-21 | 2022-02-18 | 特斯联科技集团有限公司 | Visible light image geometric radiation splicing processing method of multi-source monitor |
CN114066732B (en) * | 2021-11-21 | 2022-05-24 | 特斯联科技集团有限公司 | Visible light image geometric radiation splicing processing method of multi-source monitor |
CN117173161A (en) * | 2023-10-30 | 2023-12-05 | 杭州海康威视数字技术股份有限公司 | Content security detection method, device, equipment and system |
CN117173161B (en) * | 2023-10-30 | 2024-02-23 | 杭州海康威视数字技术股份有限公司 | Content security detection method, device, equipment and system |
Also Published As
Publication number | Publication date |
---|---|
CN111415298B (en) | 2023-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111415298A (en) | Image splicing method and device, electronic equipment and computer readable storage medium | |
US10547871B2 (en) | Edge-aware spatio-temporal filtering and optical flow estimation in real time | |
CN110555795A (en) | High-resolution style transfer
CN111739005B (en) | Image detection method, device, electronic equipment and storage medium | |
CN110503725A (en) | Image processing method and apparatus, electronic device and computer readable storage medium
CN111753961A (en) | Model training method and device, and prediction method and device | |
US9734599B2 (en) | Cross-level image blending | |
CN111225236B (en) | Method and device for generating video cover, electronic equipment and computer-readable storage medium | |
CN111507806A (en) | Virtual shoe fitting method, device, equipment and storage medium | |
JP7213291B2 (en) | Method and apparatus for generating images | |
US11641446B2 (en) | Method for video frame interpolation, and electronic device | |
JP2019520662A (en) | Content-based search and retrieval of trademark images | |
KR102432561B1 (en) | Edge-based three-dimensional tracking and registration method and apparatus for augmented reality, and electronic device | |
US11568631B2 (en) | Method, system, and non-transitory computer readable record medium for extracting and providing text color and background color in image | |
KR20220153667A (en) | Feature extraction methods, devices, electronic devices, storage media and computer programs | |
CN111209909B (en) | Construction method, device, equipment and storage medium for qualification recognition template | |
JP2022536320A (en) | Object identification method and device, electronic device and storage medium | |
US20200279355A1 (en) | Previewing a content-aware fill | |
US20240161240A1 (en) | Harmonizing composite images utilizing a semantic-guided transformer neural network | |
CN111079059A (en) | Page checking method, device, equipment and computer readable storage medium | |
JP7269979B2 (en) | Method and apparatus, electronic device, computer readable storage medium and computer program for detecting pedestrians | |
CN112541934B (en) | Image processing method and device | |
CN112381877A (en) | Positioning fusion and indoor positioning method, device, equipment and medium | |
CN111385489B (en) | Method, device and equipment for manufacturing short video cover and storage medium | |
US20190287225A1 (en) | Patch validity test |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |