WO2010101434A2 - Apparatus and method of generating panoramic image and computer-readable recording medium storing program for executing the method - Google Patents
- Publication number: WO2010101434A2 (application PCT/KR2010/001380)
- Authority: WO (WIPO (PCT))
- Prior art keywords: image, regions, matching region, sub-region
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present invention relates to an apparatus and method of generating a panoramic image and a computer-readable recording medium having embodied thereon a program for executing the method, and more particularly, to an apparatus and method of generating a panoramic image having a naturally blended matching region, and a computer-readable recording medium having embodied thereon a program for executing the method.
- In digital image photographing apparatuses, such as digital single-lens reflex (SLR) cameras, digital optical devices, such as a charge-coupled device (CCD) and a complementary metal oxide semiconductor (CMOS) sensor, are used to create digital images.
- Image photographing apparatuses, such as film-based image photographing apparatuses and digital image photographing apparatuses, create images by developing optical information introduced through optical devices, such as a lens, an iris, and a shutter, on a film, or by converting the optical information into electrical energy by using an optical sensor.
- However, the image photographing apparatuses are limited by the angle of view of the lens facing a subject.
- Panoramic images have been suggested as a way to overcome this limitation and to satisfy the various needs of users of digital image photographing apparatuses.
- the term panoramic image refers to a wide field of view image that cannot be taken with a single lens but may be obtained by using a special photographing technique, by changing a focal point of the lens, or by performing digital image processing.
- a panoramic image is a wide field of view image generated by connecting a plurality of separately picked images in rows, columns, or both.
- the methods require additional equipment, are greatly affected by subjective factors, such as a user’s operational method, and are not suitable for portable compact mobile terminals that provide an image capturing service.
- the methods and apparatuses generate a panoramic image by detecting a matching region between two or more images and combining the two or more images based on the matching region.
- the panoramic image generated by the methods and apparatuses is not natural in many cases.
- the distortion and the unnatural combination may be corrected by using a program executed in a computing device capable of high speed operation.
- However, it is difficult to perfectly correct the distortion and the unnatural combination when using a mobile terminal having limited resources and limited operation capability.
- the present invention provides an apparatus and method of generating a natural panoramic image by naturally blending a matching region between original images.
- the present invention also provides a computer-readable recording medium having embodied thereon a program for executing the method.
- an apparatus for generating a panoramic image including: an image acquiring unit for sequentially acquiring a plurality of images; a matching region acquiring unit for acquiring a matching region, which is an overlapping region between a first image and a second image to be combined with the first image from among the plurality of images, wherein the matching region includes sub-regions obtained by dividing the matching region in a direction perpendicular to a combination direction in which the first image and the second image are combined; and a panorama generating unit for generating a panoramic image by blending a region of the first image corresponding to the matching region with a region of the second image corresponding to the matching region in units of the sub-regions by using a weight function that is defined for each of the sub-regions.
- the panorama generating unit may include: a matching region blending unit for obtaining a blending matching region by using color information of pixels in the regions of the first image and the second image corresponding to the matching region and weight values of the sub-regions calculated by the weight function; and a panorama combining unit for combining a region of the first image other than the matching region, the blending matching region, and a region of the second image other than the matching region into the panoramic image.
- the matching region blending unit may obtain the blending matching region by calculating a color information deviation by subtracting color information of pixels in regions of the first image corresponding to the sub-regions from color information of pixels in regions of the second image corresponding to the sub-regions, in units of the sub-regions, calculating a weighted color information deviation by multiplying weight values of the sub-regions by the color information deviation, and adding the weighted color information deviation to the color information of the pixels in the regions of the first image corresponding to the sub-regions.
- the matching region blending unit may obtain the blending matching region by weight averaging color information of pixels in regions of the first image corresponding to the sub-regions and color information of pixels in regions of the second image corresponding to the sub-regions with the weight values of the sub-regions, in units of the sub-regions.
- the panorama generating unit may include: an image line loading unit for sequentially loading in the combination direction lines that are composed of pixels of the first image and the second image and are perpendicular to the combination direction; a matching region determining unit for determining whether a loaded line is included in the matching region; and a panorama sequential generation unit for, if it is determined that the loaded line is not included in the matching region, inserting the loaded line into a position of the panoramic image corresponding to the loaded line, and if the loaded line is included in the matching region, loading a matching line of the second image matched to the loaded line, determining a final line of the loaded line by using a weight value of a sub-region in which the loaded line is included and which is calculated by the weight function, and inserting the final line into the position of the panoramic image corresponding to the loaded line.
- the number of the sub-regions may be equal to the number of lines of pixels of the matching region in the combination direction.
- the number of the sub-regions may be variable.
- the weight function is a monotonic function with input variables of 0 to L+1 (L is an integer)
- the weight function may have a value of 0 when the input variable is 0 and have a value of 1 when the input variable is L+1.
- the weight function may be a linear function.
- the apparatus may further include an image correcting unit for selecting a predetermined number of pixels having the same positions in the regions of the first image and the second image corresponding to the matching region, calculating averages of color information of the pixels, and multiplying a ratio of the averages by color information of all pixels included in the first image or the second image.
- a method of generating a panoramic image including: sequentially acquiring a plurality of images; acquiring a matching region, which is an overlapping region between a first image and a second image to be combined with the first image from among the plurality of images, wherein the matching region includes sub-regions obtained by dividing the matching region in a direction perpendicular to a combination direction in which the first image and the second image are combined with each other; and generating a panoramic image by blending a region of the first image corresponding to the matching region and a region of the second image corresponding to the matching region in units of the sub-regions by using a weight function that is defined for each of the sub-regions.
- a computer-readable recording medium having embodied thereon a program for executing the method.
- the apparatus and method according to the present invention may generate a natural panoramic image by blending regions of original images corresponding to a matching region by using a weight value that is defined based on the distance of each pixel.
- the apparatus and method according to the present invention may achieve natural combination between regions other than the matching region by readjusting auto white balance and exposure values by considering characteristics of an image photographing apparatus of a mobile terminal.
- the apparatus and method according to the present invention may generate a panoramic image further suitable for the hardware environment of a mobile terminal since a natural panoramic image is generated even with simple computation and limited resources.
- FIG. 1 is a block diagram of an apparatus for generating a panoramic image, according to an embodiment of the present invention
- FIG. 2 is a block diagram of a panorama generating unit of the apparatus of FIG. 1, according to an embodiment of the present invention
- FIG. 3 is a block diagram of a panorama generating unit of the apparatus of FIG. 1, according to another embodiment of the present invention.
- FIGS. 4 through 8 are schematic views for explaining a process of generating a panoramic image, according to an embodiment of the present invention.
- FIG. 9 is a flowchart illustrating a method of generating a panoramic image, according to an embodiment of the present invention.
- FIG. 10 is a block diagram illustrating a panorama generating operation of the method of FIG. 9, according to an embodiment of the present invention.
- FIG. 11 is a block diagram illustrating a panorama generating operation of the method of FIG. 9, according to another embodiment of the present invention.
- FIG. 12 is a flowchart illustrating an image correcting operation optionally included in the method of FIG. 9.
- FIGS. 13 and 14 illustrate panoramic images before and after blending.
- a color space of an image, which is essential to image processing, may be expressed in various ways, for example, as red, green, blue (RGB); cyan, magenta, yellow, key black (CMYK); HS-family; the Commission Internationale de l'Eclairage (CIE); and Y-family, according to color mixing or similarity to the visual system of human beings. It is obvious to one of ordinary skill in the art that a color space may be converted to another kind of color space by a simple mathematical conversion formula.
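As a small illustration of such a conversion (using the ITU-R BT.601 luma weights, which are an assumption of this sketch and not part of the patent), RGB color information can be mapped to the Y component of a Y-family color space with one linear formula:

```python
def rgb_to_luma(r, g, b):
    """Convert 8-bit RGB color information to a luma (Y) value using the
    ITU-R BT.601 weights, an example of a simple linear color-space
    conversion formula."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Gray pixels keep their level, and green contributes more to perceived
# brightness than blue, mirroring the human visual system.
gray = rgb_to_luma(128, 128, 128)
```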
- An input image includes a plurality of pixels, and each of the pixels has its unique image information such as brightness, hue, and saturation.
- image information has values of 0 to 255 and is represented as 8-bit information.
- the image information may be represented as 10-bit or 12-bit information depending on application conditions.
- a color space coordinate system used as an example in the present invention may be applicable to another color space coordinate system equally or similarly, and a bit size of image information of a pixel in the input image is just an example used to describe the present invention.
- FIG. 1 is a block diagram of an apparatus 100 for generating a panoramic image, according to an embodiment of the present invention.
- the apparatus 100 includes an image acquiring unit 110, a matching region acquiring unit 120, and a panorama generating unit 130.
- the apparatus 100 may include an image correcting unit 140.
- the image acquiring unit 110 sequentially acquires a plurality of images which are to be combined into a panoramic image.
- the plurality of images include at least two original images to be combined.
- the image acquiring unit 110 may acquire images to be combined into a vertical panoramic image or a horizontal panoramic image.
- the image acquiring unit 110 may sequentially acquire images to be combined into a vertical and horizontal panoramic image, such as a 2x2, 2x3, or 3x3 panoramic image. In this case, however, an overlapping region may not exist between images that are sequentially acquired; that is, a first image may instead overlap with a part of a third image or a fourth image.
- the plurality of images to be combined into a panoramic image may be received from an external device of the apparatus 100, or may be acquired by being directly captured by a camera unit (not shown) included in the apparatus 100.
- the matching region acquiring unit 120 acquires a matching region that is an overlapping region between a first image and a second image to be combined with the first image from among the plurality of images which are sequentially acquired by the image acquiring unit 110.
- the matching region may be acquired by extracting feature points of the first and second images, and causing coordinates of the first and second images to correspond to each other through pattern matching of the extracted feature points.
- the matching region may be divided into a plurality of sub-regions. As the number of the sub-regions increases, a more natural panoramic image is achieved.
- the panorama generating unit 130 generates a panoramic image by blending a region of the first image corresponding to the matching region and a region of the second image corresponding to the matching region in units of the sub-regions by using a weight function that is defined for each of the sub-regions.
- the panoramic image generated by the panorama generating unit 130 is natural in terms of color and distortion since color information of the first image and color information of the second image are smoothly integrated with each other.
- the image correcting unit 140 which is optionally included in the apparatus 100 performs image correction by selecting a predetermined number of pixels having the same positions in the regions of the first image and the second image corresponding to the matching region, calculating a first average and a second average by calculating averages of color information of the selected pixels, and applying a ratio of the first average to the second average to all pixels included in the first image or the second image.
- white balance and exposure values determined during capturing of the first image and the second image may be similar to each other, thereby making the panoramic image more natural.
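A minimal sketch of this correction for a single color channel (the function and variable names are illustrative, and applying the ratio to the second image is one of the two options the description allows; the patent does not prescribe an implementation):

```python
def correct_exposure(first_samples, second_samples, second_image):
    """Equalize white balance/exposure between two images to be combined.

    first_samples and second_samples are color values sampled at the same
    positions inside the matching region of each image; second_image holds
    a color value for every pixel of the second image.  The ratio of the
    two averages is applied to all pixels of the second image, as described
    for the optional image correcting unit 140.  Each color channel would
    be handled the same way.
    """
    first_avg = sum(first_samples) / len(first_samples)
    second_avg = sum(second_samples) / len(second_samples)
    ratio = first_avg / second_avg
    # Scale every pixel of the second image, clamping to the 8-bit range.
    return [min(255, round(p * ratio)) for p in second_image]
```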
- since the image correcting unit 140 is optionally included in the apparatus 100, it may be omitted without departing from the scope of the present invention.
- the image correcting unit 140 will be explained later in detail with reference to FIG. 12.
- FIG. 2 is a block diagram of a panorama generating unit 130a of the apparatus 100 of FIG. 1, according to an embodiment of the present invention.
- the panorama generating unit 130a includes a matching region blending unit 131 and a panorama combining unit 132.
- the matching region blending unit 131 uses color information of pixels in the regions of the first image and the second image corresponding to the matching region.
- the matching region blending unit 131 calculates a weight value of each of the sub-regions by using a weight function and uses the weight value of each of the sub-regions to blend the matching region.
- the matching region blending unit 131 may calculate a color information deviation by subtracting color information of pixels in regions of the first image corresponding to the sub-regions from color information of pixels in regions of the second image corresponding to the sub-regions, in units of the sub-regions. Next, the matching region blending unit 131 may calculate a weighted color information deviation by multiplying a weight value of each of the sub-regions, which is calculated by the weight function, by the color information deviation. Finally, the matching region blending unit 131 may obtain a blending matching region by adding the weighted color information deviation to the color information of the pixels in the regions of the first image corresponding to the sub-regions. The matching region blending unit 131 may obtain the blending matching region by performing the above process on all pixels included in all of the sub-regions.
- the matching region blending unit 131 may weight average color information of pixels in regions of the first image corresponding to the sub-regions and color information of pixels in regions of the second image corresponding to the sub-regions by using the weight values of the sub-regions, in units of the sub-regions, to obtain a weighted average value.
- the matching region blending unit 131 may allocate the weighted average value as a value of color information of pixels of a blending matching region.
- the matching region blending unit 131 may obtain the blending matching region by performing the above process on all pixels included in all of the sub-regions.
- the matching region blending unit 131 will be explained later in detail with reference to FIG. 4.
- the panorama combining unit 132 may combine a region of the first image other than the matching region, the blending matching region, and a region of the second image other than the matching region, to generate a panoramic image.
- FIG. 3 is a block diagram of a panorama generating unit 130b of the apparatus 100 of FIG. 1, according to another embodiment of the present invention.
- the panorama generating unit 130b includes an image line loading unit 135, a matching region determining unit 136, and a panorama sequential generation unit 137.
- the image line loading unit 135 may sequentially load lines composed of pixels of the first image and the second image.
- the lines may be perpendicular to a combination direction in which the first image and the second image are combined.
- the matching region determining unit 136 may determine whether a loaded line loaded by the image line loading unit 135 is included in the matching region.
- if it is determined that the loaded line is not included in the matching region, the panorama sequential generation unit 137 may insert the loaded line into a corresponding position of the panoramic image. If the loaded line is included in the matching region, the panorama sequential generation unit 137 may load a matching line of an image to be combined which corresponds to the loaded line. Next, the loaded line and the matching line are combined by using the sub-region in which the loaded line is included and the weight value of the sub-region, to form a final line. The final line may be inserted into the corresponding position of the panoramic image.
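This line-by-line flow can be sketched as follows, with images represented as lists of vertical pixel lines and a linear weight, assuming the number of sub-regions equals the number d of overlapping lines (the smoothest case noted in the description; all names are illustrative):

```python
def generate_panorama_by_lines(first, second, d):
    """Sequentially build a horizontal panorama from two images.

    first and second are lists of vertical pixel lines (each line a list of
    single-channel values); the last d lines of `first` overlap the first d
    lines of `second`.  Lines outside the matching region are copied
    through; lines inside it are blended with a linear weight.
    """
    panorama = []
    n = len(first)
    for x, line in enumerate(first):
        if x < n - d:                    # not in the matching region
            panorama.append(line)
        else:                            # blend with the matching line
            a = x - (n - d) + 1          # sub-region number, 1..d
            w = a / (d + 1)              # linear weight value
            match = second[x - (n - d)]
            panorama.append([p1 + w * (p2 - p1)
                             for p1, p2 in zip(line, match)])
    panorama.extend(second[d:])          # remainder of the second image
    return panorama
```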
- FIGS. 4 through 8 are schematic views for explaining a process of generating a panoramic image, according to an embodiment of the present invention.
- two images 302 and 304 are exemplarily illustrated from among a plurality of images sequentially acquired by the image acquiring unit 110.
- the image acquiring unit 110 may sequentially acquire a plurality of images to be combined into a panoramic image.
- the first image 302 is an arbitrary image acquired by the image acquiring unit 110
- the second image 304 is an image to be combined with the first image 302 in order to generate the horizontal panoramic image. It is assumed that each of the first image 302 and the second image 304 has a resolution of n x m.
- the scope of the present invention is not limited by a combination direction in which images are combined and the number of the images to be combined.
- the matching region acquiring unit 120 may acquire a matching region that is an overlapping region between the first image 302 and the second image 304.
- feature points, for example, “312”, of the first image 302 and the second image 304, may be extracted.
- Feature points refer to regions or points that are distinguishable from surrounding regions or points.
- feature points may be points having the highest brightness compared to surrounding points, contact points between boundaries, or points having a pixel size different from those of surrounding points.
- Feature points may be determined in various extraction methods without departing from the scope of the present invention.
- Warping may be performed by extracting homography between the first image 302 and the second image 304 based on the feature points 312.
- Homography refers to a process of making a pixel coordinate system of one image the same as a pixel coordinate system of another image in order to combine the images. Distortion of the first image 302 and the second image 304 may be corrected due to the warping.
- one of the feature points 312 may be selected. Coordinates of the selected feature point in the first image 302 and the second image 304 may be obtained. For example, assuming that coordinates of the feature point 312 in the first image 302 are (x1, y1) and coordinates of the feature point 312 in the second image 304 are (x2, y2), a region of the first image 302 corresponding to a matching region 308 may be defined with (x1-x2, y1-y2), (x1-x2, m), (n, y1-y2), and (n, m).
- a coordinate system sets a left upper end of each image to (1, 1) and a right lower end of the image to (n, m). As shown in FIG. 4, it is assumed that the second image 304 is combined with a right portion of the first image 302.
- the matching region acquiring unit 120 may acquire the matching region 308 between the first image 302 and the second image 304 through the above process.
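The corner formula above can be expressed directly; this sketch simply restates it, assuming a left-upper origin of (1, 1) as in the description:

```python
def matching_region_corners(feature_first, feature_second, n, m):
    """Corners of the region of the first image that overlaps the second.

    feature_first = (x1, y1) and feature_second = (x2, y2) are coordinates
    of the same feature point in each n x m image; the four corners follow
    the formula (x1-x2, y1-y2), (x1-x2, m), (n, y1-y2), (n, m).
    """
    (x1, y1), (x2, y2) = feature_first, feature_second
    return [(x1 - x2, y1 - y2), (x1 - x2, m), (n, y1 - y2), (n, m)]
```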
- the matching region 308 may alternatively be received from an external device of the apparatus 100.
- a photographing unit (not shown) of the apparatus 100 may provide the matching region 308, which is previously determined, by capturing an image so that the first image 302 and the second image 304 partially overlap with each other.
- the first image 302 and the second image 304 are illustrated with the matching region 308.
- the matching region acquiring unit 120 may determine a combination direction in which the first image 302 and the second image 304 are combined with each other by using the matching region 308.
- in a horizontal panoramic image mode, if the matching region 308 is located on a right portion of the first image 302 as shown in FIG. 5, the second image 304 is to be combined with the right portion of the first image 302, and in this case, the combination direction is a rightward direction.
- in a vertical panoramic image mode, if the matching region 308 is located on a lower end portion of the first image 302, the second image 304 is to be combined with the lower end portion of the first image 302, and in this case, the combination direction may be a downward direction.
- the matching region 308 is composed of d x m pixels. That is, the matching region 308 includes “d” vertical lines, each composed of m pixels.
- the matching region 308 may include a plurality of sub-regions 314 that are obtained by dividing the matching region 308 in a direction perpendicular to the combination direction. If the combination direction is a rightward direction, the sub-regions 314 may be obtained by dividing the matching region 308 in a vertical direction. In FIGS. 4 and 5, it is assumed that the matching region 308 includes L sub-regions 314. Since a pixel is a minimum unit having color information, L is not greater than “d”.
- the number L of the sub-regions 314 is a constant that allows smooth combination. As the number L of the sub-regions 314 constituting the matching region 308 increases, smoother combination between the first image 302 and the second image 304 may be achieved. For example, if the number L of the sub-regions 314 is equal to the number “d” of lines of pixels in the combination direction, the smoothest combination between the first and second images 302 and 304 may be achieved.
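One way to divide the “d” pixel lines of the matching region into L sub-regions is an even split, sketched below (the even split is an assumption of this sketch; the description only requires L ≤ d):

```python
def subregion_of_line(x, d, L):
    """Sub-region number (1..L) of the xth pixel line (1..d) of the
    matching region, splitting the d lines evenly into L sub-regions.
    When L == d, every line forms its own sub-region, which the
    description notes gives the smoothest combination."""
    return (x - 1) * L // d + 1
```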
- a plurality of images including at least two images to be combined with each other are acquired and a matching region which is an overlapping region between the two images is determined.
- the panorama generating unit 130 will now be explained in detail.
- the panorama generating unit 130 may be the panorama generating unit 130a illustrated in FIG. 2 or the panorama generating unit 130b illustrated in FIG. 3.
- the panorama generating unit 130a generates a panoramic image by performing an arithmetic operation on the matching region and then combining the matching region with non-matching regions 306 and 310.
- the panorama generating unit 130b may generate a panoramic image by sequentially loading lines composed of pixels of the first image 302 and the second image 304 and performing an arithmetic operation.
- the operation of the panorama generating unit 130b may be suitable for displaying the panoramic image on a screen.
- a weight function w(x) defined for each of the sub-regions 314 and a weight value of each of the sub-regions 314 calculated by using the weight function w(x) will be explained with reference to FIGS. 6 and 7.
- a weight value of each of the sub-regions 314 may be calculated by using the weight function w(x).
- the weight function w(x), which is a function with input variables of 0 to L+1 (where L is an integer), may have a value equal to or greater than 0 and equal to or less than 1.
- the weight function w(x) may have a value of 0 when an input variable is 0, and may have a value of 1 when an input variable is L+1.
- the weight function w(x) may be a monotonically non-decreasing function, whose value increases or remains the same as the input variable increases.
- the weight function w(x) may be determined by a user’s selection. In order to reduce the amount of computation of the apparatus 100, a linear function may be determined as the weight function w(x).
- An input of the weight function w(x) may be the number of a sub-region, and the corresponding output may be the weight value of that sub-region.
- a weight value of an ath sub-region is a result value of the weight function w(x) when “a” is an input to the weight function w(x), that is, w(a).
- “a” is obviously equal to or greater than 1 and equal to or less than L, and a number of a sub-region is assumed to be determined in the combination direction.
- a leftmost sub-region is a first sub-region
- a rightmost sub-region is an Lth sub-region.
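Under these conventions, a linear weight function satisfying w(0) = 0 and w(L+1) = 1 reduces to a single division, sketched here (a minimal example consistent with the description, not the only admissible weight function):

```python
def w(a, L):
    """Linear weight value of the a-th sub-region (a = 0..L+1): 0 at
    a = 0, 1 at a = L+1, and monotonically increasing in between."""
    return a / (L + 1)

# With L = 4 sub-regions, the weights of sub-regions 1..4 step evenly
# from 0.2 to 0.8, so the blend shifts gradually from the first image
# (weight near 0) toward the second image (weight near 1).
weights = [w(a, 4) for a in range(0, 6)]
```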
- the matching region blending unit 131 of the panorama generating unit 130a of FIG. 2 blends regions of the first image 302 and the second image 304 corresponding to the matching region 308.
- the matching region blending unit 131 calculates a color information deviation by subtracting color information of all pixels included in the region of the first image 302 corresponding to the matching region 308 from color information values of all pixels included in the region of the second image 304 corresponding to the matching region 308. For example, if color information of a pixel P1 with coordinates (i, j) of the first image 302 corresponding to the matching region 308 is (r1, g1, b1) and color information of a pixel P2 with coordinates (i, j) of the second image 304 corresponding to the matching region 308 is (r2, g2, b2), a color information deviation may be (r2-r1, g2-g1, b2-b1).
- the coordinates (i, j) are determined by a coordinate system limited to the matching region 308, and it is assumed that a left upper end of the matching region 308 is (0, 0) and a right lower end of the matching region 308 is (d, m).
- a result obtained by inputting “a” to the weight function w(x), that is, a weight value w(a), is multiplied by the color information deviation.
- a weighted color information deviation w(a)(r2-r1, g2-g1, b2-b1) is obtained.
- a final matching region 318 may be obtained by adding the weighted color information deviation w(a)(r2-r1, g2-g1, b2-b1) to the color information (r1, g1, b1) of the pixel P1 with the coordinates (i, j) of the first image 302.
- Final color information of a pixel with the coordinates (i, j) of the final matching region 318 may be (r1+w(a)(r2-r1), g1+w(a)(g2-g1), b1+w(a)(b2-b1)).
- the above process is performed on all pixels included in the matching region 308. In order to reduce the amount of computation, the above process may be performed in units of sub-regions having the same weight value that is a result of the weight function w(x).
- Pixels included in an ith line of the final matching region 318 have final color information (r1+i(r2-r1)/(d+1), g1+i(g2-g1)/(d+1), b1+i(b2-b1)/(d+1)) according to the above formula.
- the matching region blending unit 131 may obtain final color information of the final matching region 318 by weight averaging color information of all pixels included in the region of the first image 302 corresponding to the matching region 308 and color information of all pixels included in the region of the second image 304 corresponding to the matching region 308 with the weight values of sub-regions in which the pixels are included.
- color information of a pixel with coordinates (i, j) of the matching region 308 may be calculated by weight averaging color information of a pixel with the coordinates (i, j) of the first image 302 and color information of a pixel with the coordinates (i, j) of the second image 304 with 1-w(a) and w(a), respectively.
- the amount of computation may be reduced since weight averaging is used and a difference between color information of pixels included in the first image 302 and the second image 304 does not need to be obtained.
- the above process may be performed in units of sub-regions having the same result of the weight function w(x) in order to reduce the amount of computation.
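The weighted-averaging variant can be sketched the same way (hypothetical Python, not the patent's code). It yields the same result as the deviation form, since (1-w)·c1 + w·c2 = c1 + w·(c2-c1), but needs no explicit per-pixel difference.

```python
def blend_weighted_average(c1, c2, w):
    # Weighted average with weight (1 - w) for the first image's pixel
    # and w for the second image's pixel. Algebraically identical to
    # c1 + w*(c2 - c1), but no deviation has to be computed first.
    return tuple((1 - w) * a + w * b for a, b in zip(c1, c2))
```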
- although the RGB color coordinate system has been used to describe the present invention, the present invention is not limited thereto.
- the panorama combining unit 132 of FIG. 2 may generate a panoramic image 330 by combining the final matching region 318, which is obtained by the matching region blending unit 131, with the non-matching region 306 of the first image 302 other than the matching region 308 and with the non-matching region 310 of the second image 304 other than the matching region 308.
- portions near the subject, that is, the right portion of the first image 302 and the left portion of the second image 304, appear larger than the left portion of the first image 302 and the right portion of the second image 304.
- the roof of a house in the first image 302 is inclined at a positive angle
- the roof of a house in the second image 304 is inclined at a negative angle.
- This is due to distortion that occurs when a three-dimensional (3D) space is displayed as a two-dimensional (2D) space, that is, a phenomenon where the same length appears to decrease as a distance from a lens of an image photographing apparatus increases.
- Such distortion may result in X-shaped overlapping or make a connected line appear to be a disconnected line, thereby leading to an unnatural panoramic image.
- a portion “A” of the final matching region 318 is slightly curved. Since a weight value is used to combine the first image 302 and the second image 304, the portion “A” does not appear X-shaped but merely curved. Even with a wide-angle lens having a wide viewing angle, a straight line near the periphery of an image appears curved, so this distortion may be considered moderate.
- FIGS. 13 and 14 illustrate panoramic images before and after blending.
- before blending, the first image 302 and the second image 304 are separated by a clear line and are mismatched. After blending, however, the first image 302 and the second image 304 are naturally combined.
- the panorama generating unit 130b of FIG. 3 will now be explained with reference to FIGS. 3, 4, 5, and 8.
- the image line loading unit 135 sequentially loads all pixels included in the first image 302 and the second image 304 in units of lines.
- the lines are perpendicular to the combination direction in which the first image 302 and the second image 304 are combined and may be loaded sequentially in the combination direction. Referring to FIG. 5, a leftmost vertical line composed of leftmost “m” pixels of the first image 302 is first loaded, and then an adjacent right vertical line is loaded. In the same manner, lines are loaded in the order of the non-matching region 306 of the first image 302, the matching region 308 of the first image 302, the matching region 308 of the second image 304, and the non-matching region 310 of the second image 304.
- the matching region determining unit 136 determines whether a line loaded by the image line loading unit 135 is included in the matching region 308.
- Information about the matching region 308 may be acquired by the matching region acquiring unit 120 illustrated in FIG. 1.
- the panorama sequential generation unit 137 may load a matching line of the second image 304 matched to the loaded line.
- a final matching line may be obtained by using a weight value of a sub-region 314 in which the loaded line is included.
- a panoramic image may be generated by inserting the final matching line into a position corresponding to the loaded line.
- a panoramic image may be generated by inserting the loaded line into the position corresponding to the loaded line.
- the image line loading unit 135 loads pixels in the non-matching region 310.
- the panorama generating unit 130b may generate a panoramic image by performing the above process on all pixels.
- the panorama generating unit 130b of FIG. 3 may sequentially generate the panoramic image 330 at the same time as the final matching region 318 is obtained, and may not require a great amount of computation or high-capacity memory resources.
- the panorama generating unit 130b of FIG. 3 may be particularly suitable for displaying the panoramic image 330 on a display device in real time or transmitting data to another device. This is because the matching region 308 may be blended and displayed on a screen at the same time.
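The line-by-line sequential generation described above might look roughly like the following hypothetical Python sketch. The list-of-lines image layout, the function name, and the linear per-line weight are all illustrative assumptions, not the patent's implementation.

```python
def generate_panorama_lines(img1, img2, match_start, match_len):
    # img1, img2: images as lists of vertical lines, each line a list
    # of (r, g, b) tuples. The last match_len lines of img1 overlap
    # the first match_len lines of img2 (hypothetical layout).
    panorama = []
    # Non-matching region of the first image: lines are copied as-is.
    for x in range(match_start):
        panorama.append(img1[x])
    # Matching region: each pair of overlapping lines is blended with
    # a weight that grows linearly across the overlap.
    for k in range(match_len):
        w = k / (match_len + 1)
        line1, line2 = img1[match_start + k], img2[k]
        panorama.append([tuple((1 - w) * a + w * b for a, b in zip(p1, p2))
                         for p1, p2 in zip(line1, line2)])
    # Non-matching region of the second image.
    for x in range(match_len, len(img2)):
        panorama.append(img2[x])
    return panorama
```

Because each output line depends only on the currently loaded line pair, lines can be emitted (or displayed) as soon as they are blended, which matches the low-memory, real-time property claimed for the unit 130b.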
- the image correcting unit 140 illustrated in FIG. 1 will now be explained.
- the image correcting unit 140 may be disposed between the matching region acquiring unit 120 and the panorama generating unit 130 and correct an image before generating a panoramic image.
- the image correcting unit 140 selects a predetermined number of pixels having matching positions in the regions of the first image 302 and the second image 304 corresponding to the matching region 308.
- the predetermined number of pixels may be previously determined pixels included in the matching region 308, for example, pixels of a central region. Alternatively, the predetermined number of pixels may be selected randomly. Alternatively, the predetermined number of pixels may be all pixels included in the matching region 308.
- the pixels may have the same color information in the first image 302 and the second image 304.
- color information may not be exactly the same.
- white balance and exposure values may be different between the first image 302 and the second image 304 even though the first image 302 and the second image 304 are obtained by photographing the same subject. The difference in the white balance and exposure values leads to a sense of difference between the first image 302 and the second image 304.
- the image correcting unit 140 calculates a first color average and a second color average by averaging color information of the selected predetermined number of pixels in the first image 302 and the second image 304.
- the first color average of the selected predetermined number of pixels may be (R1, G1, B1) and the second color average may be (R2, G2, B2).
- the second image 304 may be brighter than the first image 302 and pixels of the second image 304 may have higher color information, for example, RGB. That is, color information R2 + G2 + B2 of the selected predetermined number of pixels in the second image 304 may be greater than color information R1 + G1 + B1 in the first image 302.
- pixels of the second image 304 may have lower blue information. That is, blue information B2 of the selected predetermined number of pixels in the second image 304 may be lower than blue information B1 in the first image 302.
- the image correcting unit 140 may calculate a color ratio or a color deviation which is a difference between the first color average and the second color average.
- a color deviation may be (R1-R2, G1-G2, B1-B2) and a color ratio may be (R1/R2, G1/G2, B1/B2), or vice versa.
- the image correcting unit 140 may apply the color deviation or the color ratio to color information of all pixels included in the first image 302 or the second image 304.
- white balance and exposure values of the second image 304 may be corrected to be equal to white balance and exposure values of the first image 302 by adding the color deviation to color information of all pixels included in the second image 304.
- white balance and exposure values of the first image 302 may be corrected to be equal to white balance and exposure values of the second image 304 by subtracting the color deviation from color information of all pixels included in the first image 302.
- white balance and exposure values of the first image 302 and the second image 304 may be equal by multiplying or dividing color information of all pixels included in the first image 302 or the second image 304 by the color ratio.
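The correction step can be sketched as follows. This is a hypothetical Python illustration: the function names, the way matching pixels are passed in, and the choice of the ratio form (rather than the deviation) are assumptions, not the patent's code.

```python
def color_ratio(img1_pixels, img2_pixels):
    # img1_pixels, img2_pixels: (r, g, b) tuples sampled at matching
    # positions inside the matching region of each image.
    n = len(img1_pixels)
    avg1 = [sum(p[c] for p in img1_pixels) / n for c in range(3)]
    avg2 = [sum(p[c] for p in img2_pixels) / n for c in range(3)]
    # Per-channel ratio (R1/R2, G1/G2, B1/B2) between the averages.
    return tuple(a1 / a2 for a1, a2 in zip(avg1, avg2))

def apply_ratio(pixels, ratio):
    # Multiplying every pixel of the second image by the ratio brings
    # its color averages in line with the first image's, approximating
    # a common white balance and exposure.
    return [tuple(v * r for v, r in zip(p, ratio)) for p in pixels]
```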
- the panoramic image may be more natural.
- although the RGB color coordinate system has been used to describe the present invention, it is to be understood by one of ordinary skill in the art that other color coordinate systems may be used.
- FIG. 9 is a flowchart illustrating a method of generating a panoramic image, according to an embodiment of the present invention.
- the image acquiring unit 110 acquires a plurality of images including a first image and a second image having an overlapping region therebetween, which has been explained above in detail and thus will not be explained again here.
- the matching region acquiring unit 120 acquires a matching region that is the overlapping region between the first image and the second image.
- the matching region includes a plurality of sub-regions that are obtained by dividing the matching region in a direction perpendicular to a combination direction in which the first image and the second image are combined, which has been explained above in detail and thus will not be explained again here.
- the image correcting unit 140 may correct the first image or the second image.
- a predetermined number of pixels having the same positions in regions of the first image and the second image corresponding to the matching region may be selected, averages of color information of the pixels may be respectively calculated, and a ratio of the averages may be applied to all pixels included in the first image or the second image.
- the application may be performed by multiplying or dividing the first image or the second image by the ratio, which has been explained above in detail and thus will not be explained again here.
- the panorama generating unit 130 may generate a panoramic image by combining the first image and the second image, which has been explained above in detail and thus will not be explained again here.
- FIG. 10 is a flowchart illustrating the operation S30 of the method of FIG. 9, according to an embodiment of the present invention.
- a panorama generating operation S30a may include a matching region blending operation S31 and a panorama combining operation S32.
- the matching region blending unit 131 may obtain a blending matching region by blending regions of the first image and the second image corresponding to the matching region, which has been explained above in detail and thus will not be explained again here.
- the panorama combining unit 132 may generate a panoramic image by combining the blending matching region with non-matching regions of the first image and the second image, which has been explained above in detail and thus will not be explained again here.
- FIG. 11 is a flowchart illustrating the operation S30 of the method of FIG. 9, according to another embodiment of the present invention.
- a panorama generating operation S30b may include an image loading operation, a matching region determining operation, and a panorama sequential generation operation.
- the image line loading unit 135 may sequentially load all pixels included in the first image and the second image in units of lines.
- the matching region determining unit 136 may determine whether a loaded line is included in the matching region.
- the method proceeds to operation S37.
- the panorama sequential generation unit 137 may load a matching line of the second image matched to the loaded line.
- a final matching line may be obtained by using a weight value of the loaded line.
- a panoramic image may be generated by inserting the final matching line into a position corresponding to the loaded line.
- the method proceeds to operation S39.
- the panorama sequential generation unit 137 may generate a panoramic image by inserting the loaded line into the position corresponding to the loaded line.
- in the panorama generating operation S30b, after all pixels included in the first image are loaded, all pixels included in a non-matching region of the second image are loaded.
- the above process may be performed on all pixels to generate a panoramic image.
- FIG. 12 is a flowchart illustrating operation S40 optionally included in the method of FIG. 9.
- the operation S40 may include an operation S42 in which a predetermined number of pixels having matching positions are selected in the first image and the second image.
- a first average, which is an average of color information of the selected pixels in the first image, and a second average, which is an average of color information of the selected pixels in the second image, are calculated, and a ratio or a deviation between the first and second averages is calculated.
- the ratio or the deviation between the first and second averages is applied to the first image or the second image.
- the present invention may be embodied as computer-readable codes on a computer-readable recording medium.
- the computer-readable recording medium is any data storage device that may store data which may be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memories (ROMs), random-access memories (RAMs), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
- the computer-readable recording medium may be installed in a computer system connected to a network, and stored and executed as a computer-readable code in a distributed computing environment. Functional programs, codes, and code segments for embodying the present invention may be easily derived by programmers in the art to which the present invention belongs.
- the present invention relates to an apparatus and method of generating a panoramic image and a computer-readable recording medium having embodied thereon a program for executing the method, and more particularly, to an apparatus and method of generating a panoramic image having a naturally blended matching region, and a computer-readable recording medium having embodied thereon a program for executing the method.
- the apparatus and method according to the present invention may generate a natural panoramic image by blending regions of original images corresponding to a matching region by using a weight value defined based on the distance of each pixel. Also, the apparatus and method according to the present invention may generate a panoramic image well suited to the hardware environment of a mobile terminal, since a natural panoramic image is generated even with simple computation and limited resources.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Stereoscopic And Panoramic Photography (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010800103173A CN102342092A (zh) | 2009-03-05 | 2010-03-05 | 生成全景图像的装置、方法及由记录有运行该方法的程序的计算机可读取的记录介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020090018915A KR100968378B1 (ko) | 2009-03-05 | 2009-03-05 | 파노라마 이미지를 생성하는 장치, 방법 및 그 방법을 실행하는 프로그램이 기록된 기록 매체 |
KR10-2009-0018915 | 2009-03-05 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2010101434A2 true WO2010101434A2 (en) | 2010-09-10 |
WO2010101434A3 WO2010101434A3 (en) | 2010-11-04 |
Family
ID=42645252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2010/001380 WO2010101434A2 (en) | 2009-03-05 | 2010-03-05 | Apparatus and method of generating panoramic image and computer-readable recording medium storing program for executing the method |
Country Status (3)
Country | Link |
---|---|
KR (1) | KR100968378B1 (zh) |
CN (1) | CN102342092A (zh) |
WO (1) | WO2010101434A2 (zh) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160189379A1 (en) * | 2014-12-25 | 2016-06-30 | Vivotek Inc. | Image calibrating method for stitching images and related camera and image processing system with image calibrating function |
CN105869119A (zh) * | 2016-05-06 | 2016-08-17 | 安徽伟合电子科技有限公司 | 一种动态视频获取方法 |
US9766533B2 (en) | 2014-02-12 | 2017-09-19 | Samsung Electronics Co., Ltd. | Flash device, and imaging method |
GB2555585A (en) * | 2016-10-31 | 2018-05-09 | Nokia Technologies Oy | Multiple view colour reconstruction |
JP2020195051A (ja) * | 2019-05-28 | 2020-12-03 | 池上通信機株式会社 | 撮像装置、及び撮像制御方法 |
US11017508B2 (en) * | 2016-10-12 | 2021-05-25 | Lg Innotek Co., Ltd. | Image matching method and apparatus |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5754312B2 (ja) | 2011-09-08 | 2015-07-29 | カシオ計算機株式会社 | 画像処理装置及び画像処理方法、並びにプログラム |
CN102708558A (zh) * | 2012-06-01 | 2012-10-03 | 惠州华阳通用电子有限公司 | 视频图像拼接装置、拼接方法以及视频监视系统 |
CN103997633B (zh) * | 2014-06-04 | 2016-03-23 | 武汉烽火众智数字技术有限责任公司 | 一种多通道ccd图像拼接非均匀性自动校正的方法 |
KR101579005B1 (ko) * | 2014-08-08 | 2015-12-21 | 중앙대학교 산학협력단 | 영상생성장치 및 영상생성방법 |
CN104461438B (zh) * | 2014-12-29 | 2017-09-15 | 广东欧珀移动通信有限公司 | 显示控制的方法、装置及移动终端 |
KR102334742B1 (ko) * | 2015-03-04 | 2021-12-02 | 한화테크윈 주식회사 | 영상 합성 장치 및 방법 |
KR20170032761A (ko) * | 2015-09-15 | 2017-03-23 | 엘지전자 주식회사 | 이동 단말기 |
CN105957009B (zh) * | 2016-05-06 | 2019-05-07 | 安徽伟合电子科技有限公司 | 一种基于插值过渡的图像拼接方法 |
CN105931188A (zh) * | 2016-05-06 | 2016-09-07 | 安徽伟合电子科技有限公司 | 一种基于均值去重的图像拼接方法 |
CN106023073A (zh) * | 2016-05-06 | 2016-10-12 | 安徽伟合电子科技有限公司 | 一种图像拼接系统 |
CN105976320A (zh) * | 2016-05-06 | 2016-09-28 | 安徽伟合电子科技有限公司 | 一种图像拼接方法 |
CN105976319A (zh) * | 2016-05-06 | 2016-09-28 | 安徽伟合电子科技有限公司 | 一种应用于图像拼接的交界重现方法 |
CN107220955A (zh) * | 2017-04-24 | 2017-09-29 | 东北大学 | 一种基于重叠区域特征点对的图像亮度均衡方法 |
EP3487162B1 (en) * | 2017-11-16 | 2021-03-17 | Axis AB | Method, device and camera for blending a first and a second image having overlapping fields of view |
KR102076635B1 (ko) * | 2018-10-16 | 2020-02-12 | (주)캠시스 | 산재된 고정 카메라를 이용한 파노라마 영상 생성 장치 및 방법 |
CN111798540B (zh) * | 2020-05-25 | 2023-03-31 | 青海大学 | 图像融合方法和系统 |
CN114730548B (zh) * | 2020-10-27 | 2023-12-12 | 京东方科技集团股份有限公司 | 拼接屏白平衡调整的方法和装置、电子设备、介质 |
US20220405987A1 (en) * | 2021-06-18 | 2022-12-22 | Nvidia Corporation | Pixel blending for neural network-based image generation |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100678208B1 (ko) * | 2005-07-08 | 2007-02-02 | 삼성전자주식회사 | 휴대단말기의 이미지 저장 및 표시방법 |
KR100724134B1 (ko) * | 2006-01-09 | 2007-06-04 | 삼성전자주식회사 | 이미지 매칭 속도와 블렌딩 방법을 개선한 파노라마 영상제공 방법 및 장치 |
CN101034253A (zh) * | 2007-04-12 | 2007-09-12 | 华为技术有限公司 | 实现拍摄全景照的装置及方法 |
KR100866278B1 (ko) * | 2007-04-26 | 2008-10-31 | 주식회사 코아로직 | 파노라마 영상 생성 장치, 방법 및 상기 방법을프로그램화하여 수록한 컴퓨터로 읽을 수 있는 기록매체 |
2009
- 2009-03-05 KR KR1020090018915A patent/KR100968378B1/ko active IP Right Grant
2010
- 2010-03-05 WO PCT/KR2010/001380 patent/WO2010101434A2/en active Application Filing
- 2010-03-05 CN CN2010800103173A patent/CN102342092A/zh active Pending
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9766533B2 (en) | 2014-02-12 | 2017-09-19 | Samsung Electronics Co., Ltd. | Flash device, and imaging method |
US20160189379A1 (en) * | 2014-12-25 | 2016-06-30 | Vivotek Inc. | Image calibrating method for stitching images and related camera and image processing system with image calibrating function |
US9716880B2 (en) * | 2014-12-25 | 2017-07-25 | Vivotek Inc. | Image calibrating method for stitching images and related camera and image processing system with image calibrating function |
CN105869119A (zh) * | 2016-05-06 | 2016-08-17 | 安徽伟合电子科技有限公司 | 一种动态视频获取方法 |
US11017508B2 (en) * | 2016-10-12 | 2021-05-25 | Lg Innotek Co., Ltd. | Image matching method and apparatus |
GB2555585A (en) * | 2016-10-31 | 2018-05-09 | Nokia Technologies Oy | Multiple view colour reconstruction |
JP2020195051A (ja) * | 2019-05-28 | 2020-12-03 | 池上通信機株式会社 | 撮像装置、及び撮像制御方法 |
JP7360819B2 (ja) | 2019-05-28 | 2023-10-13 | 池上通信機株式会社 | 撮像装置、及び撮像制御方法 |
Also Published As
Publication number | Publication date |
---|---|
WO2010101434A3 (en) | 2010-11-04 |
CN102342092A (zh) | 2012-02-01 |
KR100968378B1 (ko) | 2010-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010101434A2 (en) | Apparatus and method of generating panoramic image and computer-readable recording medium storing program for executing the method | |
WO2020116981A1 (en) | Image sensor for generating depth data by a path difference of light generated through micro lens covering a plurality of sub-pixels and electronic device including the image sensor | |
WO2015083971A1 (en) | Electronic apparatus and method of controlling the same | |
WO2015141925A1 (en) | Photographing apparatus, method of controlling the same, and computer-readable recording medium | |
WO2009131382A2 (en) | Apparatus and method for correcting moving image wavering | |
WO2016021790A1 (en) | Imaging sensor capable of detecting phase difference of focus | |
WO2013073850A1 (en) | Digital photographing apparatus and method of controlling the same | |
WO2019164185A1 (en) | Electronic device and method for correcting image corrected in first image processing scheme in external electronic device in second image processing scheme | |
WO2014189332A1 (en) | Imaging sensor capable of phase difference focus detection cross-reference to related patent application | |
WO2016076497A1 (ko) | 메타 데이터에 기초하여 영상을 디스플레이하는 방법 및 디바이스, 그에 따른 기록매체 | |
WO2013089370A1 (en) | Image pickup apparatus, method of performing image compensation, and computer readable recording medium | |
US5644359A (en) | Image input apparatus having a white balance correction means and a method of inputting an image using the image input apparatus | |
WO2021101162A1 (ko) | 딥 화이트-밸런싱 편집을 위한 방법 및 장치 | |
EP4046199A1 (en) | Electronic device comprising image sensor and method of operation thereof | |
WO2013077522A1 (en) | Apparatus and method for hierarchical stereo matching | |
WO2022103121A1 (en) | Electronic device for estimating camera illuminant and method of the same | |
JPH10200804A (ja) | カメラ型スキャナ | |
WO2019088407A1 (ko) | 보색관계의 필터 어레이를 포함하는 카메라 모듈 및 그를 포함하는 전자 장치 | |
WO2021162307A1 (ko) | 전자 장치 및 그의 hdr 영상 생성 방법 | |
WO2020190030A1 (en) | Electronic device for generating composite image and method thereof | |
WO2020171450A1 (ko) | 뎁스 맵을 생성하는 전자 장치 및 방법 | |
EP4101161A1 (en) | Multi-frame depth-based multi-camera relighting of images | |
WO2018038300A1 (ko) | 이미지 제공 장치, 방법 및 컴퓨터 프로그램 | |
WO2017179912A1 (ko) | 삼차원 정보증강 비디오 씨쓰루 디스플레이 장치 및 방법, 렉티피케이션 장치 | |
WO2019245208A1 (ko) | 이미지의 광원의 타입을 결정하는 전자 장치 및 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080010317.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10748976 Country of ref document: EP Kind code of ref document: A2 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10748976 Country of ref document: EP Kind code of ref document: A2 |