CN113163111A - Panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, storage medium and terminal - Google Patents
- Publication number: CN113163111A (application CN202110319616.5A)
- Authority: CN (China)
- Prior art keywords: image, weighting, stitching, width, gaussian
- Legal status: Granted
Classifications
- H04N 23/951 — Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
- G06T 5/50 — Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
- H04N 23/698 — Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Abstract
The invention discloses a panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, a storage medium and a terminal, relating to the technical field of image processing. The stitching method comprises the following steps: acquiring two vertically aligned images through cameras with an overlapping field of view, the overlap between the left and right images having width ω; taking one input image as the reference and adjusting the remaining images to be consistent with it. The method addresses the visible, unnaturally blended seams, and in particular the ghosting, that arise when images are stitched in the typical near-to-far scenes of panoramic surveillance, thereby further improving the panoramic effect. It adaptively computes the distance from targets in the overlap region to the cameras and shows good adaptivity in practical applications.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, a storage medium and a terminal.
Background
Image stitching technology plays an important role in panoramic image composition. Even after two images have been registered and aligned, differences in viewing angle, exposure, optical center and the like leave an obvious boundary trace in the composite; image stitching (seam blending) techniques are used to eliminate this boundary.
In a practical pipeline, seam blending is applied after image registration, which ensures that the positions of the two images (in both the x and y directions) are aligned; alignment can also be guaranteed by the placement of the two cameras. Because real scenes are complex, blending remains difficult. Many blending algorithms exist, including the direct method and multi-band methods. For example, one multiband stitching approach decomposes the images into low, medium and high frequency bands with a pyramid and then completes the blend with linear weighting. Although this mitigates differences in angle and exposure, and to some extent reduces blurring in the blended region when registration errors are present, it still cannot handle the ghosting that remains in the stitched image. How to eliminate ghosting in the stitched image, and thereby improve the panoramic visual effect, is therefore the problem to be solved.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, a storage medium and a terminal, addressing the shortcomings of existing stitching methods.
The purpose of the invention is realized by the following technical scheme: a panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, comprising the following steps:
taking a certain input image as the reference and adjusting the remaining images to be consistent with it;
extracting feature points of the adjusted image overlapping area;
geometrically transforming the width of the image overlapping area;
and obtaining a final stitching image through a weighting algorithm after transformation.
Before the input images are adjusted, the images to be stitched are acquired and the width ω of the image overlap region is determined.
Taking a certain input image as the reference and adjusting the remaining images to be consistent with it comprises the following steps:
determining a certain image as the reference image;
and taking the brightness of the reference image as the reference standard and adjusting the brightness of the remaining images to be consistent with it.
Geometrically transforming the width of the image overlap region comprises the following:
calculating the distance from the scene in the overlap region to the cameras through the feature points in the overlap region of the images;
arranging by distance from near to far to obtain the overlap widths ωᵢ of different rows of the image overlap region, corresponding to different distances;
geometrically transforming the image according to the per-row overlap widths ωᵢ.
Obtaining the final stitched image from the transformed image through a weighting algorithm comprises obtaining it through a Gaussian weighting algorithm or a sine weighting algorithm.
Obtaining the final stitched image by the sine weighting algorithm comprises the following steps:
inputting the image into a sine function model;
calculating, from the image overlap width ω, the sine-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
and synthesizing the stitched images of all rows to obtain the final stitched image.
Obtaining the final stitched image by the Gaussian weighting algorithm comprises the following steps:
inputting the image into a Gaussian function model;
calculating, from the image overlap width ω, the Gaussian-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
and synthesizing the stitched images of all rows to obtain the final stitched image.
Extracting the feature points of the adjusted image overlap region comprises extracting the feature points in the overlap region with the SIFT feature extraction algorithm: first compute the DoG (difference-of-Gaussians) images, then detect local extrema in the DoG images, and finally screen out stable extrema using a Hessian-matrix eigenvalue test.
A storage medium having stored therein computer program instructions which, when executed, perform the steps of the panoramic image stitching method based on Gaussian weighting or sinusoidal weighting.
A terminal comprising a memory and a processor, the memory having stored thereon computer program instructions executable on the processor, the processor performing, when executing the computer program instructions, the steps of the panoramic image stitching method based on Gaussian weighting or sinusoidal weighting.
The invention has the beneficial effects that: the method improves on the visible, unnaturally blended seams, and in particular the ghosting, that arise when images are stitched in the typical near-to-far scenes of panoramic surveillance, thereby further improving the panoramic effect. It adaptively computes the distance from targets in the overlap region to the cameras and shows good adaptivity in practical applications.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a geometric schematic of binocular ranging;
FIG. 3a is a left image to be stitched collected by a camera;
FIG. 3b is a right image to be stitched captured by the camera;
FIG. 4a is a transformed left image to be stitched;
FIG. 4b is the transformed right image to be stitched;
FIG. 5 is an image after stitching by the method of the present invention;
FIG. 6 is a graph of a stitched image according to a prior art method;
FIG. 7a is an enlarged image of the image detail after stitching by the prior art method;
FIG. 7b is an enlarged image of the image detail after stitching according to the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that the terms "upper", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings or orientations or positional relationships that the products of the present invention conventionally use, which are merely for convenience of description and simplification of description, but do not indicate or imply that the devices or elements referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," and "connected" are to be construed broadly, e.g., as meaning fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The technical solutions of the present invention are further described in detail below with reference to the accompanying drawings, but the scope of the present invention is not limited to the following.
The inventors found that although differences in shooting angle, exposure time and, more importantly, imaging model between the two images introduce small errors into the final registration, the main cause of ghosting in real multi-camera stitching is the variation of the distance from the target object to the cameras within the overlap region of the two cameras; the stitching method of the present invention therefore attacks this cause of ghosting directly.
The term "ghost" in the present application refers to the artifact in which some targets appear doubled in the stitched region because parts of the overlap are misaligned.
As shown in fig. 1, a panorama image stitching method based on gaussian weighting or sinusoidal weighting includes the following:
S1, acquiring two vertically aligned images through cameras with an overlapping field of view, the overlap between the left and right images having width ω;
S2, taking one image as the reference and adjusting the remaining image to be consistent with it;
S3, extracting the feature points in the overlap region with the SIFT (Scale-Invariant Feature Transform) feature extraction algorithm;
S4, geometrically transforming the width of the image overlap region;
S5, obtaining the final stitched image from the transformed image through a weighting algorithm.
Further, taking a certain input image as the reference and adjusting the remaining image to be consistent with it comprises:
S21, determining one of the left and right images as the reference image;
S22, taking the brightness of the reference image as the reference standard and adjusting the brightness of the other image to be consistent with it.
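The patent does not specify how the brightness adjustment of S21 and S22 is performed; a minimal NumPy sketch using simple mean-ratio gain matching (an assumed realization, not the patent's exact procedure) might look like:

```python
import numpy as np

def match_brightness(reference: np.ndarray, image: np.ndarray) -> np.ndarray:
    """Scale `image` so its mean brightness matches `reference`.

    Both inputs are grayscale arrays in [0, 255]. This mean-ratio gain
    is one plausible realization of the brightness adjustment step.
    """
    img_mean = image.mean()
    if img_mean == 0:
        return image.copy()
    gain = reference.mean() / img_mean
    return np.clip(image * gain, 0, 255)

left = np.full((4, 4), 100.0)   # reference image (S21)
right = np.full((4, 4), 80.0)   # darker image to adjust (S22)
adjusted = match_brightness(left, right)
```

In practice a per-channel or histogram-based correction could replace the single global gain; the structure of S21 and S22 stays the same.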
Further, as shown in fig. 2, geometrically transforming the width of the image overlap region comprises the following. From the geometry of the figure, the distance dᵢ from a point P in the overlap region of the two images to the cameras can be calculated. This distance determines the width of the overlap region: the farther the scene content, the smaller the overlap region; the nearer, the larger (the main cause of ghosting is that the overlap can be aligned for near content but not for far content). The overlap region can therefore be geometrically transformed according to a linear variation from near to far, the effect of which is to turn the original rectangular image into a right trapezoid.
S41, calculating the distance from the scene in the overlap region to the cameras through the feature points in the overlap region of the images;
S42, arranging by distance from near to far to obtain the overlap widths ωᵢ of different rows of the image overlap region, corresponding to different distances;
if the nearest overlap distance is d₁, the farthest distance d₂, the nearest overlap width ω₁ and the farthest overlap width ω₂, then ωᵢ is linear in dᵢ and satisfies ωᵢ = (ω₁ − ω₂)·dᵢ/(d₂ − d₁);
S43, geometrically transforming the image according to the per-row overlap widths ωᵢ.
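The per-row width computation of S41 and S42 can be sketched as follows. The depth-from-disparity relation d = f·B/disparity is the standard binocular-ranging formula suggested by the geometry of FIG. 2 (the focal length `f` and baseline `baseline` values below are placeholders, not values from the patent); the ωᵢ relation is implemented exactly as the patent states it:

```python
import numpy as np

def depth_from_disparity(disparity: float, f: float = 700.0, baseline: float = 0.1) -> float:
    """Standard binocular ranging: depth = focal_length * baseline / disparity.

    f (pixels) and baseline (metres) are illustrative placeholder values.
    """
    return f * baseline / disparity

def row_overlap_widths(d, d1, d2, w1, w2):
    """Per-row overlap widths, using the linear relation stated in the patent:
    w_i = (w1 - w2) * d_i / (d2 - d1), implemented verbatim."""
    d = np.asarray(d, dtype=float)
    return (w1 - w2) * d / (d2 - d1)

# row distances sorted from near to far, as S42 prescribes (illustrative values)
d = np.array([2.0, 4.0, 6.0, 8.0])
w = row_overlap_widths(d, d1=2.0, d2=8.0, w1=120.0, w2=30.0)
```

The resulting ωᵢ vector then drives the row-wise geometric transform of S43 (rectangle to right trapezoid).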
Further, obtaining the final stitched image from the transformed image through a weighting algorithm comprises obtaining it through a Gaussian weighting algorithm or a sine weighting algorithm.
Obtaining the final stitched image by the sine weighting algorithm comprises the following steps:
A1, inputting the image into a sine function model;
A2, calculating, from the image overlap width ω, the sine-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
A3, synthesizing the stitched images of all rows to obtain the final stitched image.
Further, the sine function model is: S(x) = sin(x);
the resulting sine-weighted stitching function for the overlap width ωᵢ of each row is:
f(x) = f₁(x)·(1 − S(ωᵢ)) + f₂(x)·S(ωᵢ),  (0 < S < π/(2ωᵢ))
where f₁(x) denotes the left image, f₂(x) the right image, ωᵢ the width of the i-th row of the overlap region, and S the sine function.
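A minimal sketch of blending one row of the overlap with the sine-weighted function above. The patent's constraint 0 < S < π/(2ωᵢ) is read here as a sine weight S = sin(πx/(2ωᵢ)) that ramps from 0 to 1 across the overlap; that reading is an assumption, not the patent's exact definition:

```python
import numpy as np

def sine_blend_row(left_row: np.ndarray, right_row: np.ndarray, w_i: int) -> np.ndarray:
    """Blend the w_i-pixel overlap of one image row with a sine weight.

    Implements f(x) = f1(x)*(1 - S) + f2(x)*S with S = sin(pi*x / (2*w_i)),
    so the right-image weight ramps from 0 to ~1 across the overlap.
    """
    x = np.arange(w_i, dtype=float)
    s = np.sin(np.pi * x / (2.0 * w_i))   # 0 at the left edge, approaching 1 at the right edge
    return left_row[:w_i] * (1.0 - s) + right_row[:w_i] * s

left = np.full(8, 200.0)    # overlap strip from the left image f1
right = np.full(8, 100.0)   # overlap strip from the right image f2
blended = sine_blend_row(left, right, 8)
```

Applying this row by row with the per-row widths ωᵢ yields the stitched overlap; pixels outside the overlap are copied unchanged.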
Obtaining the final stitched image by the Gaussian weighting algorithm comprises the following steps:
B1, inputting the image into a Gaussian function model;
B2, calculating, from the image overlap width ω, the Gaussian-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
B3, synthesizing the stitched images of all rows to obtain the final stitched image.
The resulting Gaussian-weighted stitching function for the overlap width ωᵢ of each row is:
f(x) = f₁(x)·(1 − G/ωᵢ) + f₂(x)·G/ωᵢ,  (0 < G < ωᵢ)
where f₁(x) denotes the left image, f₂(x) the right image, ωᵢ the width of the i-th row of the overlap region, and G a normalized Gaussian function.
Extracting the adjusted feature points of the image overlap region comprises extracting the feature points in the overlap region with the SIFT feature extraction algorithm.
Specifically, the DoG (difference-of-Gaussians) images are computed, local extrema are detected in the DoG images, and stable extrema are finally screened out with a Hessian-matrix eigenvalue test.
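The DoG-extrema step can be illustrated with a NumPy-only sketch (production code would typically use an existing SIFT implementation such as OpenCV's; the Hessian-based stability screening is omitted here for brevity):

```python
import numpy as np

def gaussian_blur(img: np.ndarray, sigma: float) -> np.ndarray:
    """Separable Gaussian blur with reflect padding (minimal, NumPy-only)."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, radius, mode="reflect")
    # horizontal pass, then vertical pass
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)

def dog_extrema(img, sigmas=(1.0, 1.6, 2.56), threshold=1.0):
    """Build DoG images from successive blurs, then keep pixels that are
    local extrema of their 3x3 neighbourhood, as the description above
    prescribes (stability screening via the Hessian is left out)."""
    blurred = [gaussian_blur(img, s) for s in sigmas]
    dogs = [b2 - b1 for b1, b2 in zip(blurred, blurred[1:])]
    points = []
    for d in dogs:
        for i in range(1, d.shape[0] - 1):
            for j in range(1, d.shape[1] - 1):
                patch = d[i - 1:i + 2, j - 1:j + 2]
                v = d[i, j]
                if abs(v) > threshold and (v == patch.max() or v == patch.min()):
                    points.append((i, j))
    return points

img = np.zeros((16, 16))
img[8, 8] = 255.0          # a single bright blob
pts = dog_extrema(img)     # the blob centre should be detected as an extremum
```

Only the feature points falling inside the overlap region would then be kept and used for the distance estimation of S41.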
As shown in FIGS. 3a-7b: FIGS. 3a and 3b are the two original rectangular left and right images; FIGS. 4a and 4b are the transformed right-trapezoid images; FIG. 5 is an image stitched by the method of the present invention, in which the stitched region runs from near to far and the transition is smooth and natural with no ghosting; FIG. 6 is an image obtained with a prior-art weighted stitching method, showing blurring and ghosting in the stitched region; FIGS. 7a and 7b compare the enlarged stitched regions of FIGS. 6 and 5 respectively, and the stitched region of FIG. 7a (from FIG. 6) is clearly heavily ghosted.
Yet another embodiment of the present invention is a storage medium having stored therein computer program instructions which, when executed, perform the steps of the panoramic image stitching method based on gaussian weighting or sinusoidal weighting.
Yet another embodiment of the present invention is a terminal comprising a memory and a processor, the memory having stored thereon computer program instructions executable on the processor, the processor executing the steps of the method for stitching panoramic images based on gaussian or sinusoidal weighting when executing the computer program instructions.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (5)
1. A panoramic image stitching method based on Gaussian weighting or sine weighting, characterized in that the stitching method comprises the following steps:
acquiring the images to be stitched, and determining the width ω of the image overlap region;
taking a certain input image as the reference and adjusting the remaining images to be consistent with it;
extracting feature points of the adjusted image overlapping area;
geometrically transforming the width of the image overlapping area;
obtaining a final stitching image through a weighting algorithm after transformation;
the geometrically transforming the width of the image overlap region comprises the following:
calculating the distance from the scene in the overlap region to the cameras through the feature points in the overlap region of the images;
arranging by distance from near to far to obtain the overlap widths ωᵢ of different rows of the image overlap region, corresponding to different distances;
if the nearest overlap distance is d₁, the farthest distance d₂, the nearest overlap width ω₁ and the farthest overlap width ω₂, then ωᵢ is linear in dᵢ and satisfies ωᵢ = (ω₁ − ω₂)·dᵢ/(d₂ − d₁), where dᵢ is the distance from the point P in the overlap region of the two images to the cameras;
geometrically transforming the image according to the per-row overlap widths ωᵢ;
obtaining the final stitched image from the transformed image through a weighting algorithm, wherein the final stitched image is obtained through a Gaussian weighting algorithm or a sine weighting algorithm;
obtaining the final stitched image by the sine weighting algorithm comprises the following:
inputting the image into a sine function model;
calculating, from the image overlap width ω, the sine-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
synthesizing the stitched images of all rows to obtain the final stitched image;
wherein the sine function model is: S(x) = sin(x); the resulting sine-weighted stitching function for the overlap width ωᵢ of each row is:
f(x) = f₁(x)·(1 − S(ωᵢ)) + f₂(x)·S(ωᵢ),  (0 < S < π/(2ωᵢ))
where f₁(x) denotes the left image, f₂(x) the right image, ωᵢ the width of the i-th row of the overlap region, and S the sine function;
obtaining the final stitched image by the Gaussian weighting algorithm comprises the following:
inputting the image into a Gaussian function model;
calculating, from the image overlap width ω, the Gaussian-weighted stitching function for each row's overlap width ωᵢ to obtain the stitched image of the corresponding row;
synthesizing the stitched images of all rows to obtain the final stitched image;
wherein the Gaussian function model is a normalized Gaussian function G; the resulting Gaussian-weighted stitching function for the overlap width ωᵢ of each row is:
f(x) = f₁(x)·(1 − G/ωᵢ) + f₂(x)·G/ωᵢ,  (0 < G < ωᵢ)
where f₁(x) denotes the left image, f₂(x) the right image, ωᵢ the width of the i-th row of the overlap region, and G the normalized Gaussian function.
2. The panoramic image stitching method based on Gaussian weighting or sine weighting according to claim 1, characterized in that taking a certain input image as the reference and adjusting the remaining images to be consistent with it comprises the following steps:
determining a certain image as the reference image;
and taking the brightness of the reference image as the reference standard and adjusting the brightness of the remaining images to be consistent with it.
3. The panoramic image stitching method based on Gaussian weighting or sine weighting according to claim 1, characterized in that extracting the adjusted feature points of the image overlap region comprises extracting the feature points in the overlap region with the SIFT feature extraction algorithm: first computing the DoG images, then detecting local extrema in the DoG images, and finally screening out stable extrema with a Hessian-matrix eigenvalue test.
4. A storage medium having computer program instructions stored therein, characterized in that the computer program instructions, when executed, perform the steps of the panoramic image stitching method based on Gaussian weighting or sinusoidal weighting according to any one of claims 1 to 3.
5. A terminal comprising a memory and a processor, the memory having stored thereon computer program instructions executable on the processor, characterized in that the processor, when executing the computer program instructions, performs the steps of the panoramic image stitching method based on Gaussian weighting or sinusoidal weighting according to any one of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110319616.5A CN113163111B (en) | 2019-10-15 | 2019-10-15 | Panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, storage medium and terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910978720.8A CN110719405B (en) | 2019-10-15 | 2019-10-15 | Multi-camera panoramic image stitching method based on binocular ranging, storage medium and terminal |
CN202110319616.5A CN113163111B (en) | 2019-10-15 | 2019-10-15 | Panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, storage medium and terminal |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910978720.8A Division CN110719405B (en) | 2019-10-15 | 2019-10-15 | Multi-camera panoramic image stitching method based on binocular ranging, storage medium and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113163111A true CN113163111A (en) | 2021-07-23 |
CN113163111B CN113163111B (en) | 2022-07-22 |
Family
ID=69212603
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110319616.5A Active CN113163111B (en) | 2019-10-15 | 2019-10-15 | Panoramic image stitching method based on Gaussian weighting or sinusoidal weighting, storage medium and terminal |
CN201910978720.8A Active CN110719405B (en) | 2019-10-15 | 2019-10-15 | Multi-camera panoramic image stitching method based on binocular ranging, storage medium and terminal |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910978720.8A Active CN110719405B (en) | 2019-10-15 | 2019-10-15 | Multi-camera panoramic image stitching method based on binocular ranging, storage medium and terminal |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006132029A1 (en) * | 2005-06-07 | 2006-12-14 | Matsushita Electric Industrial Co., Ltd. | Monitoring system, monitoring method, and camera terminal |
CN104182951A (en) * | 2014-08-15 | 2014-12-03 | 张建伟 | Multiband image stitching method based on panorama of dual cameras |
CN105488775A (en) * | 2014-10-09 | 2016-04-13 | 东北大学 | Six-camera around looking-based cylindrical panoramic generation device and method |
CN105657252A (en) * | 2015-12-25 | 2016-06-08 | 青岛海信移动通信技术股份有限公司 | Image processing method in mobile terminal and mobile terminal |
WO2018207402A1 (en) * | 2017-05-12 | 2018-11-15 | パナソニックIpマネジメント株式会社 | Image processing apparatus and image processing method |
CN110223226A (en) * | 2019-05-07 | 2019-09-10 | 中国农业大学 | Panorama Mosaic method and system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101556692A (en) * | 2008-04-09 | 2009-10-14 | 西安盛泽电子有限公司 | Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points |
CN102402855A (en) * | 2011-08-29 | 2012-04-04 | 深圳市蓝盾科技有限公司 | Method and system of fusing real-time panoramic videos of double cameras for intelligent traffic |
CN105894451B (en) * | 2016-03-30 | 2019-03-08 | 沈阳泰科易科技有限公司 | Panorama Mosaic method and apparatus |
Non-Patent Citations (1)
Title |
---|
Luo Chao et al.: "Bionic compound-eye panoramic detection and tracking strategy", Journal of Image and Graphics (中国图象图形学报) * |
Also Published As
Publication number | Publication date |
---|---|
CN113163111B (en) | 2022-07-22 |
CN110719405A (en) | 2020-01-21 |
CN110719405B (en) | 2021-02-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||