WO2014064783A1 - Image processing device, image processing method, image processing program, and recording medium - Google Patents
Image processing device, image processing method, image processing program, and recording medium
- Publication number
- WO2014064783A1 (PCT/JP2012/077491)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- input
- unit
- reference image
- target image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/44—Morphing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/38—Registration of image sequences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
Definitions
- the present invention relates to an image processing apparatus, an image processing method, an image processing program, and a recording medium.
- Conventionally, an apparatus is known that creates a panoramic still image, that is, a single wide-angle still image, by stitching captured images together (see, for example, Patent Literature 1).
- the image processing apparatus described in Patent Literature 1 combines a plurality of images obtained by imaging different directions from the same point. Specifically, the two images are combined after their coordinate systems are unified using a transformation matrix, which is calculated by the least-squares method.
- because the components of this transformation matrix are not particularly constrained, a matrix that minimizes the positional error between the two transformed images under the least-squares criterion is used. As a result, the solution tends to be a matrix that shrinks the image.
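As a toy illustration of why an unconstrained least-squares fit tends toward a shrinking solution (this is an explanatory sketch, not a computation from the patent), consider estimating a single scale factor s from noisy point correspondences. The noise energy inflates only the denominator of the closed-form estimate, biasing s below the true value of 1:

```python
# Hypothetical illustration: fit a pure scale s so that s * x_obs ~ y,
# where x_obs are noisy measurements of coordinates whose true scale
# relative to y is exactly 1. The closed-form least-squares solution is
# s = sum(x_obs * y) / sum(x_obs^2); here the noise adds only to the
# denominator, so the estimate shrinks slightly below 1.

x_true = [10.0, 20.0, 30.0, 40.0]
noise = [1.0, -1.0, -1.0, 1.0]          # illustrative measurement noise
x_obs = [x + e for x, e in zip(x_true, noise)]
y = x_true                               # true transform is the identity

num = sum(xo * yi for xo, yi in zip(x_obs, y))
den = sum(xo * xo for xo in x_obs)
s = num / den
print(s)  # slightly less than 1, i.e. a reduction
```

Over many frames, compounding such slightly-less-than-1 scales is one way the error accumulation described below can arise.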
- when the image processing apparatus described in Patent Literature 1 sequentially inputs target images captured in different orientations and generates a panoramic image by successively combining each input target image with the reference image, the target images are reduced before combination, and the transformation matrix for the next target image is estimated from the already-reduced result. For this reason, errors accumulate during synthesis, and as a result a high-quality panoramic image may not be obtained.
- an image processing apparatus according to one aspect of the present invention sequentially inputs images captured by an imaging device and generates a composite image by stitching the images together each time one is input.
- the apparatus includes an input unit, a selection unit, a matching unit, an estimation unit, and a synthesis unit.
- the input unit sequentially inputs images.
- the selection unit selects a reference image from among the input images, that is, the one or more images input by the input unit before the target image, where the target image is the processing-target image newly input by the input unit.
- the matching unit calculates a correspondence relationship between the feature points of the reference image and the feature points of the target image.
- the estimation unit assumes that the motion between the reference image and the target image is due only to rotational movement of the image sensor, and estimates, using the positional information of the feature point pairs whose correspondence is calculated by the matching unit, a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image.
- the synthesizing unit generates a synthesized image by connecting the reference image and the target image based on the conversion formula.
- with this aspect, a conversion formula relating the coordinate systems of the two images is estimated on the assumption that the motion between the reference image and the target image is caused only by rotational movement of the image sensor. Because the conversion formula therefore contains no parameters for enlargement, reduction, or translation, errors caused by reducing the input target image can be avoided. Furthermore, limiting the formula to rotation components alone avoids using a reduced target image as the reference image for subsequent frames, so accumulation of errors can be avoided. Therefore, when sequentially stitching input images together, even if images captured in different orientations are included, accumulation of errors is suppressed and a high-quality panoramic image can be obtained.
- the estimation unit may estimate a conversion formula consisting only of the rotational components about each axis of a three-dimensional coordinate system whose origin is the position of the image sensor.
- the estimation unit may prepare an objective function comprising the differences in positional information between each pair of feature points converted using the conversion formula, and estimate the conversion formula by performing a convergence calculation with an optimization method so that the objective function reaches its minimum value.
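A minimal sketch of such a convergence calculation, under the simplifying assumption of a single unknown rotation angle about the optical axis (the patent's model involves rotations about all three axes): the objective is the sum of squared position differences of the converted feature point pairs, minimized by gradient descent with a numerical gradient.

```python
import math

# Hypothetical sketch (not the patent's exact formulation): estimate one
# rotation angle that aligns target-image feature points with
# reference-image feature points, by minimizing an objective function of
# squared position differences with a simple convergence loop.

def rotate(p, theta):
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def objective(theta, pairs):
    # Sum of squared differences between each reference point and the
    # converted (rotated) target point.
    total = 0.0
    for (rx, ry), t in pairs:
        tx, ty = rotate(t, theta)
        total += (rx - tx) ** 2 + (ry - ty) ** 2
    return total

def estimate_rotation(pairs, lr=1e-3, iters=2000, eps=1e-6):
    theta = 0.0
    for _ in range(iters):
        # Central-difference numerical gradient of the objective.
        grad = (objective(theta + eps, pairs)
                - objective(theta - eps, pairs)) / (2 * eps)
        theta -= lr * grad
    return theta

ref = [(1.0, 0.0), (0.0, 1.0), (0.5, 0.5)]
tgt = [rotate(p, 0.3) for p in ref]   # target = reference rotated by 0.3 rad
pairs = list(zip(ref, tgt))
theta_hat = estimate_rotation(pairs)  # converges near -0.3, undoing the rotation
```

A practical implementation would use a proper nonlinear least-squares solver over all three rotation angles, but the loop above shows the convergence-calculation idea.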
- the estimation unit may adopt, as the initial value of the convergence calculation, the conversion formula of the image input by the input unit immediately before the target image.
- the estimation unit may project the feature point pairs whose correspondence has been calculated by the matching unit onto a spherical surface, and estimate a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image using the positional information of the projected feature point pairs. With this configuration, the converted coordinates can be expressed in a form that contains no division by variables, so the calculation cost can be suppressed and the calculation speed can be improved.
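A sketch of the spherical projection idea, with an assumed focal length f (illustrative, not a value from the patent): a pixel (x, y) is mapped to the unit direction vector of its viewing ray, after which a rotation can be applied using multiplications and additions only, with no division by variables.

```python
import math

# Hypothetical sketch: project a pixel coordinate onto the unit sphere
# centred at the camera. The focal length f is an assumed parameter.

def to_sphere(x, y, f):
    # (x, y, f) is the ray direction of the pixel on an image plane at
    # distance f; normalizing puts the point on the unit sphere.
    n = math.sqrt(x * x + y * y + f * f)
    return (x / n, y / n, f / n)

p = to_sphere(0.0, 0.0, 1.0)   # the principal point maps to (0, 0, 1)
q = to_sphere(3.0, 4.0, 12.0)
```

Once points are on the sphere, a 3x3 rotation maps them to other sphere points directly; the perspective division is deferred until the final drawing step.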
- when a plurality of images have been input by the input unit before the target image and at least one of them overlaps the reference image, the estimation unit may estimate, in association with each other, a conversion formula that matches the coordinate system of the reference image with the coordinate system of the overlapping image and a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image, using the feature point pairs of the reference image, the overlapping image, and the target image. With this configuration, the positional relationships among these images can be estimated in consideration not only of the relationship between the reference image and the target image but also of the relationship between the reference image and the other images, so the accuracy of the conversion formula can be improved.
- the selection unit may, when a predetermined condition is satisfied, select the current target image as the reference image for target images newly input by the input unit from the next time onward. By selecting in this way, a reference image having a large overlapping area with the target image can be selected.
- when the distance between the target image and a past reference image is smaller than the distance between the target image and the current reference image, the selection unit may select the past reference image as the reference image for target images newly input from the next time onward.
- when the distance between the target image and the past reference image is a predetermined value or less, the matching unit may further calculate the correspondence relationship between the feature points of the past reference image and the feature points of the target image.
- the estimation unit may further estimate a conversion formula that matches the coordinate system of the past reference image with the coordinate system of the target image, using the positional information of the feature point pairs of the past reference image and the target image.
- the reference image and the past reference image may then be associated with each other by way of the conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image and the conversion formula that matches the coordinate system of the past reference image with the coordinate system of the target image.
- the image processing apparatus may further include a guide unit that is connected to a display unit for displaying images and causes the display unit to show a guidance display that guides the user's camera operation.
- images whose relative positions have been determined are linked to each other, and each such link is recorded as a determined pair.
- the synthesis unit outputs the composite image to the display unit.
- the guide unit may cause the display unit to show a guidance display that guides the imaging position from the current imaging position toward the position of the first image linked to the current reference image among the pairs whose relative positions have been determined.
- An image processing method according to another aspect sequentially inputs images captured by an image sensor and stitches them together each time one is input to generate a composite image.
- the method includes an input step, a selection step, a matching step, an estimation step, and a synthesis step.
- in the input step, images are sequentially input.
- in the selection step, a reference image is selected from among the input images, that is, the one or more images input in the input step before the target image, where the target image is the processing-target image newly input in the input step.
- in the matching step, a correspondence relationship between the feature points of the reference image and the feature points of the target image is calculated.
- in the estimation step, it is assumed that the motion between the reference image and the target image is due only to rotational movement of the image sensor, and a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image is estimated using the positional information of the feature point pairs whose correspondence was calculated in the matching step.
- in the synthesis step, the reference image and the target image are stitched together based on the conversion formula to generate a composite image.
- An image processing program according to another aspect causes a computer to function so as to sequentially input images captured by an image sensor and to stitch them together each time one is input to generate a composite image.
- the program causes the computer to function as an input unit, a selection unit, a matching unit, an estimation unit, and a synthesis unit.
- the input unit sequentially inputs images.
- the selection unit selects a reference image from among the input images, that is, the one or more images input by the input unit before the target image, where the target image is the processing-target image newly input by the input unit.
- the matching unit calculates a correspondence relationship between the feature points of the reference image and the feature points of the target image.
- the estimation unit assumes that the motion between the reference image and the target image is due only to rotational movement of the image sensor, and estimates, using the positional information of the feature point pairs whose correspondence is calculated by the matching unit, a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image.
- the synthesizing unit generates a synthesized image by connecting the reference image and the target image based on the conversion formula.
- A recording medium according to another aspect is a computer-readable recording medium on which an image processing program is recorded, the program causing a computer to sequentially input images captured by an imaging device and to stitch them together each time one is input to generate a composite image.
- the program causes the computer to function as an input unit, a selection unit, a matching unit, an estimation unit, and a synthesis unit.
- the input unit sequentially inputs images.
- the selection unit selects a reference image from among the input images, that is, the one or more images input by the input unit before the target image, where the target image is the processing-target image newly input by the input unit.
- the matching unit calculates a correspondence relationship between the feature points of the reference image and the feature points of the target image.
- the estimation unit assumes that the motion between the reference image and the target image is due only to rotational movement of the image sensor, and estimates, using the positional information of the feature point pairs whose correspondence is calculated by the matching unit, a conversion formula that matches the coordinate system of the reference image with the coordinate system of the target image.
- the synthesizing unit generates a composite image by stitching the reference image and the target image together based on the conversion formula.
- according to the image processing apparatus of one aspect of the present invention, when sequentially stitching together images that have been input, even if images with different imaging orientations are included, error accumulation is suppressed and a high-quality panoramic image can be obtained.
- the image processing apparatus is a device that creates a single image by sequentially stitching input images together each time one is input. It is suitably employed, for example, when joining a plurality of continuously captured images in real time to generate a panoramic image with a wider angle of view than any single captured image.
- the image processing apparatus according to the present embodiment is preferably mounted on a mobile terminal with limited resources such as a mobile phone, a digital camera, and a PDA (Personal Digital Assistant), but is not limited thereto. For example, it may be mounted on a normal computer system.
- in the following, for ease of understanding, a mobile terminal having a camera function will be described as an example of the image processing apparatus according to the present invention.
- FIG. 1 is a functional block diagram of a mobile terminal 2 including an image processing apparatus 1 according to the present embodiment.
- a mobile terminal 2 shown in FIG. 1 is a mobile terminal carried by a user, for example, and has a hardware configuration shown in FIG.
- FIG. 2 is a hardware configuration diagram of the mobile terminal 2.
- physically, the mobile terminal 2 is configured as a normal computer system that includes a CPU (Central Processing Unit) 100, main storage devices such as a ROM (Read Only Memory) 101 and a RAM (Random Access Memory) 102, an input device 103 such as a camera or a keyboard, an output device 104 such as a display, and an auxiliary storage device 105 such as a hard disk.
- each function of the mobile terminal 2 and of the image processing apparatus 1 described later is realized by loading predetermined computer software onto hardware such as the CPU 100, the ROM 101, and the RAM 102, operating the input device 103 and the output device 104 under the control of the CPU 100, and reading and writing data in the main storage devices and the auxiliary storage device 105.
- the image processing apparatus 1 itself may likewise be configured as a normal computer system including the CPU 100, main storage devices such as the ROM 101 and the RAM 102, the input device 103, the output device 104, and the auxiliary storage device 105.
- the mobile terminal 2 may include a communication module or the like.
- the portable terminal 2 includes a camera 20, an image processing device 1, and a display unit 21.
- the camera 20 has a function of capturing an image.
- an image sensor or the like is used as the camera 20.
- the camera 20 has a continuous imaging function that repeatedly captures images at a predetermined interval from a timing specified by a user operation or the like, for example.
- the user can capture a series of images that overlap one another at least in the vertical and horizontal directions by sliding the camera 20 or rotating it about a predetermined position.
- the camera 20 has a function of outputting a captured image to the image processing apparatus 1 every time it is captured.
- the display unit 21 is a display device that can display a composite image and the like.
- the image processing apparatus 1 has a function of sequentially joining captured images to generate a wide-angle panoramic composite image.
- the viewing angle of a normal camera is about 50 to 65 degrees (diagonal angle of view).
- the image processing apparatus 1 has functions, described later, for joining input images so as to generate a composite image with an angle of view of 65 degrees or more. For example, as shown in FIG. 3, when the imaging direction of the camera 20 changes as indicated by the arrow K, the sequentially input images are stitched together and the composite images are drawn one after another onto the synthesis plane Sp. If the image input this time is Ic, the image Ic is combined with the existing composite image It to generate a single composite image.
- the image processing apparatus 1 does not simply connect the target image Ic, which is the processing target to be combined, to the composite image It, but also connects it after performing deformation processing.
- for example, the image processing apparatus 1 applies to the target image Ic (A) enlargement/reduction, (B) parallelogram deformation (horizontal), (C) parallelogram deformation (vertical), (D) rotation, (E) translation (horizontal), (F) translation (vertical), (G) trapezoidal deformation (horizontal), and (H) trapezoidal deformation (vertical).
- the target image Ic that has undergone these deformation processes or a combination of the deformation processes is drawn on the composite plane Sp.
- the image processing apparatus 1 includes an input unit 10, a selection unit 11, a matching unit 12, an estimation unit 13, a synthesis unit 14, and a guide unit 15.
- the input unit 10 has a function of inputting an image captured by the camera 20.
- the input unit 10 has a function of inputting, for example, an image captured by the camera 20 every time it is captured. Further, the input unit 10 has a function of saving the first input image in a first temporary storage area (output image buffer) provided in the mobile terminal 2. Further, the input unit 10 has a function of saving images input continuously from the next time in a second temporary storage area (input image buffer) provided in the mobile terminal.
- each time a new input image is stored in the second temporary storage area, alignment between that image and the output image stored in the first temporary storage area is performed, and whether or not the image is a drawing target is determined.
- the output image stored in the first temporary storage area is then updated by synthesis and overwritten.
- the image stored in the first temporary storage area is described as a composite image It, and the image stored in the second temporary storage area is described as a target image Ic (input image).
- the selection unit 11 has a function of selecting a reference image for alignment.
- the reference image is an image serving as a reference for alignment of the target image Ic.
- the selection unit 11 is configured to be able to refer to a memory 18 that stores information related to an input image.
- the input image is an image input by the input unit 10 before the target image Ic, and there may be one or a plurality of input images. That is, when the n-th target image Ic is denoted Ic(n-1), the input images are Ic(n-2), Ic(n-3), ..., Ic0. When there is only one input image, the selection unit 11 selects that image Ic0 as the reference image Ir0 of the target image Ic1.
- for example, the predetermined condition is that the distance between the reference image and the target image Ic(n) is a predetermined value or more.
- the selection unit 11 selects the target image Ic (n) as the reference image Ir (k) of the next new target image Ic (n + 1), and stores information on the target image Ic (n) in the memory 18. save.
- the information related to the target image Ic (n) may be, for example, only the pixel values and position information of feature points derived by the matching unit 12 described later. As described above, by limiting the information to be recorded in the memory 18, it is possible to reduce the used memory capacity as compared with the case of storing the reference image Ir itself.
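The record kept in the memory 18 might look like the following sketch (the names and fields are illustrative, not from the patent): only the feature points' positions and pixel values are retained, which is far smaller than a full image.

```python
from dataclasses import dataclass, field

# Hypothetical record for a reference image: only the feature points'
# positions and pixel values are stored, not the image itself.

@dataclass
class FeaturePoint:
    x: float            # position in image coordinates
    y: float
    value: int          # pixel value at the feature point

@dataclass
class ReferenceRecord:
    image_id: int
    features: list = field(default_factory=list)

record = ReferenceRecord(image_id=3)
record.features.append(FeaturePoint(12.0, 34.0, 200))
record.features.append(FeaturePoint(56.0, 78.0, 90))
# Storing e.g. a few hundred such points takes kilobytes, versus
# 640 * 480 = 307200 bytes for one full VGA grayscale frame.
```

This is the trade-off the text describes: keeping enough information to match against later target images while avoiding storage of the reference image Ir itself.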
- when the target image Ic(n+1) is input, the selection unit 11 refers to the memory 18 and selects the target image Ic(n) as the reference image Ir(k). In this way, the selection unit 11 selects one reference image Ir for each target image Ic. In one embodiment, the selection unit 11 may also select a temporary reference image for the target image Ic(n) when a predetermined condition is satisfied.
- the temporary reference image is an image selected from input images and is a temporary reference image. Details of the selection process of the temporary reference image will be described later.
- the selection unit 11 outputs image information related to the reference image Ir (information including at least pixel information and position information of feature points) to the matching unit 12.
- the matching unit 12 acquires the correspondence relationship between the reference image Ir and the target image Ic.
- the matching unit 12 acquires information on the feature points of the reference image Ir and the feature points of the target image Ic.
- the matching unit 12 acquires the correspondence between the reference image Ir and the target image Ic based on the pixel value of the feature point.
- as the matching method, a conventional method such as block matching can be used.
- the matching unit 12 may perform matching after making the reference image Ir and the target image Ic multi-resolution.
- the matching unit 12 changes the resolution of the reference image Ir and the target image Ic in stages, and generates a plurality of images having different resolutions.
- for example, the matching unit 12 may acquire the translation amount of feature points between the lowest-resolution images, and then perform pixel-level matching of the feature points between higher-resolution images. In this case, the processing speed can be increased and the calculation cost can be reduced.
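The coarse-to-fine idea above can be sketched in one dimension (a deliberately simplified illustration, not the patent's matching procedure): the shift is first estimated at half resolution, then refined at full resolution only around the upscaled coarse estimate, which shrinks the search range.

```python
# Hypothetical 1-D sketch of coarse-to-fine (multi-resolution) matching.

def downsample(sig):
    # Halve the resolution by averaging adjacent samples.
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig) - 1, 2)]

def best_shift(ref, tgt, candidates):
    # Shift s minimizing the mean squared difference of ref[i] vs tgt[i+s].
    def cost(s):
        pairs = [(ref[i], tgt[i + s]) for i in range(len(ref))
                 if 0 <= i + s < len(tgt)]
        return sum((a - b) ** 2 for a, b in pairs) / len(pairs)
    return min(candidates, key=cost)

def coarse_to_fine_shift(ref, tgt):
    coarse = best_shift(downsample(ref), downsample(tgt), range(-8, 9))
    # Search the full-resolution signals only near twice the coarse shift.
    return best_shift(ref, tgt, range(2 * coarse - 2, 2 * coarse + 3))

ref = [float((i * 37) % 11) for i in range(40)]  # synthetic image row
tgt = [0.0] * 6 + ref[:-6]                       # same row shifted right by 6
shift = coarse_to_fine_shift(ref, tgt)           # recovers the shift of 6
```

The full-resolution search examines only 5 candidates instead of 17, which is the cost saving the text refers to; a 2-D implementation applies the same idea to image blocks over an image pyramid.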
- the matching unit 12 acquires the position information (coordinate information) of the feature point pair for which the correspondence relationship between the reference image Ir and the target image Ic is calculated. That is, the matching unit 12 acquires a pair of position information of a feature point of the reference image Ir and position information of a feature point of the target image Ic corresponding to the feature point. The matching unit 12 acquires a plurality of pairs of feature points for alignment processing described later. The matching unit 12 outputs the acquired feature point pair to the estimation unit 13. As described above, when the selection unit 11 adds the target image Ic as the reference image Ir for the next and subsequent times, the matching unit 12 sends the pixel values and position information of the feature points of the target image Ic to the selection unit 11. Output.
- the estimation unit 13 has a function of aligning the reference image Ir and the target image Ic based on the correspondence relationship between the reference image Ir and the target image Ic.
- FIG. 5 is a schematic diagram for explaining the outline of alignment between the reference image and the target image Ic. For example, as shown in FIG. 5A, when only the first image has been input, that image is selected as the reference image Ir0. When the second image (target image Ic) is input, the estimation unit 13 aligns the target image Ic using the position of the reference image Ir0 as a reference. As shown in FIG. 5B, the alignment determines the relative position between a predetermined point of the reference image Ir0 (here, the center C0) and a predetermined point of the target image Ic (here, the center C1).
- the estimation unit 13 searches for the position where the feature point pairs acquired by the matching unit 12 overlap each other most. Then, as shown in FIG. 5C, when the estimation unit 13 completes the alignment of the reference image Ir0 and the target image Ic, it records that the two images are linked in positional relationship (link Re1).
- the selection unit 11 sets the target image Ic to the next and subsequent times. Since it is necessary to add as the reference image Ir1, the matching unit 12 outputs the pixel value and position information of the feature point of the target image Ic to the selection unit 11.
- FIG. 6 is a schematic diagram for explaining the imaging surface under rotation of the camera 20.
- let the imaging surface of the camera 20 before rotation be S0, and the imaging surface of the rotated camera 20 be S1.
- the imaging surface S0 and the imaging surface S1 do not lie in the same plane. Therefore, the position at which the feature point pairs overlap under a pure translation differs from the true overlap position. That is, when positioning is performed, it is necessary to match the positions of the feature points of the reference image Ir and the positions of the feature points of the target image Ic on the same three-dimensional coordinate system, taking the movement of the camera into account.
- the estimation unit 13 estimates a conversion formula that matches the three-dimensional coordinate system of the target image Ic to the three-dimensional coordinate system of the reference image Ir0. As shown in FIG. 7, let the coordinates of a feature point of the reference image Ir0 be (x0, y0, 1) and the coordinates of the corresponding feature point of the target image Ic be (x1, y1, 1). Then, the estimation unit 13 estimates a conversion formula under which (x0, y0, 1) and (x1, y1, 1) coincide.
- the degrees of freedom of the camera are: 1 degree of freedom for the focal length, and 3 degrees of freedom for camera movement (parallel movement in the x and y directions, and enlargement/reduction by movement in the z direction).
- for camera rotation, the image distortion (keystone distortion) caused by rotation about the x and y directions and the in-plane rotation of the image about the z axis add 3 more degrees of freedom, for 7 in total. Considering rolling-shutter distortion (focal-plane distortion) as an 8th degree of freedom, the conversion is expressed by Equation 1 below.
- the parameters a1 to h1 of the conversion matrix (conversion formula) correspond to the eight degrees of freedom described above.
- the estimation unit 13 obtains the parameters of a transformation matrix under which a plurality of feature point pairs satisfy the above relationship, by convergence calculation using an optimization method. Specifically, the convergence calculation is performed so that an objective function including the difference between the position (x0, y0, 1) of a feature point of the reference image Ir and the converted position of the corresponding feature point (x1, y1, 1) of the target image Ic takes a minimum value.
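As a hedged illustration of this convergence calculation, the 1-parameter Gauss-Newton loop below minimizes the summed squared position differences for a pure in-plane rotation. The actual embodiment estimates an 8-degree-of-freedom matrix; this is only a stand-in showing the mechanics of the objective-minimizing iteration:

```python
import numpy as np

def estimate_rotation(ref_pts, tgt_pts, iters=20):
    """Gauss-Newton sketch: find the in-plane rotation angle mapping the
    target points onto the reference points by minimizing the summed
    squared position differences (1-parameter stand-in for the 8-DoF
    matrix of Equation 1)."""
    theta = 0.0
    for _ in range(iters):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        # residuals: reference position minus rotated target position
        r = (ref_pts - tgt_pts @ R.T).ravel()
        # Jacobian of the residual with respect to theta
        dR = np.array([[-s, -c], [c, -s]])
        J = (-(tgt_pts @ dR.T)).ravel()
        step = (J @ r) / (J @ J)
        theta -= step
        if abs(step) < 1e-12:
            break
    return theta
```

The same update structure (residual, Jacobian, step) generalizes to the full parameter vector when the Newton or Gauss-Newton method mentioned below is applied.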
- as the optimization method, a known method such as the Newton method or the Gauss-Newton method is adopted.
- the estimation unit 13 limits the movement of the camera 20 to three degrees of freedom (rotation only) and uses the position information of the feature point pairs.
- the position (x0, y0, 1) of a feature point of the reference image Ir and the position (x1, y1, 1) of the corresponding feature point of the target image Ic can be made to correspond as in Equation 2 below, using the transformation matrix R.
- (cx, cy) are the respective center coordinates of the reference image Ir and the target image Ic, which here have the same image size.
- F is the focal length.
- the focal length F may be a value obtained from the specification information of the camera 20.
- when estimating the transformation matrix by performing the convergence calculation on Equation 2, the estimation unit 13 projects the two-dimensional coordinates of the feature points onto the spherical surface of a three-dimensional space Sp, and expresses the correspondence relationship using the projected coordinates.
- FIG. 8 is a schematic diagram illustrating the details of the alignment between the reference image Ir and the target image Ic. As shown in FIGS. 8A and 8B, the estimation unit 13 perspectively projects the position (x0, y0, 1) of a feature point of the reference image Ir and the position (x1, y1, 1) of the corresponding feature point of the target image Ic, both in the two-dimensional coordinate system, onto the spherical surface of the three-dimensional space Sp.
- let the coordinates after projection be (Xn, Yn, Zn).
- the coordinates (x1, y1, F) are projected, for example, according to Equation 3 below.
- the coordinate points after conversion by the conversion matrix R can be expressed as follows.
- the objective function of the convergence calculation includes the following difference r. Note that the conversion of Equation 3 can be omitted when the target image Ic is assumed to be close to the reference image Ir. In this case, since the difference r does not include division, it is not necessary to consider the loss of precision caused by division, and the calculation is simplified. Therefore, performing the calculation by projecting onto a spherical surface in three-dimensional space reduces the calculation cost.
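One plausible reading of the perspective projection onto the sphere (Equation 3): form the viewing ray through the feature point using the focal length F and normalize it to unit length. The exact scaling in the patent's Equation 3 may differ, so treat this as an assumption:

```python
import math

def project_to_sphere(x, y, cx, cy, focal):
    """Perspectively project an image point onto the unit sphere:
    form the viewing ray (x - cx, y - cy, focal) from the projection
    center and normalize it (assumed reading of Equation 3)."""
    vx, vy, vz = x - cx, y - cy, focal
    n = math.sqrt(vx * vx + vy * vy + vz * vz)
    return (vx / n, vy / n, vz / n)
```

Under this reading, the image center maps to the sphere point (0, 0, 1), and every projected feature point lies at unit distance from the origin, so rotating the sphere corresponds exactly to the rotation-only camera model.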
- the estimation unit 13 estimates the transformation matrix R by the above processing, and aligns the reference image Ir and the target image Ic.
- the estimation unit 13 sequentially aligns the reference image Ir selected by the selection unit 11 and the input target image Ic to generate a link ((C) in FIG. 8).
- FIG. 9 shows a link in which positioning is performed between eight input images. As shown in FIG. 9, the centers C0 to C7 of the eight input images are linked (links Re1 to Re7).
- the estimation unit 13 estimates the transformation matrix R by projecting onto a spherical surface, and aligns the reference image and the target image. Therefore, when performing the coordinate transformation between the two-dimensional plane and the spherical surface, the eight degrees of freedom of image deformation shown in FIG. 4, for example, are taken into account. In other words, by positioning on the spherical surface, the estimation unit 13 enables the image deformation shown in FIG. 4 when the combining unit 14 described later projects from the spherical surface onto a plane.
- FIG. 10 is a schematic diagram illustrating the reference image Ir and the temporary reference image Itr.
- as shown in (A) of FIG. 10, when an image is input, that image becomes the reference image Ir for the next and subsequent inputs.
- then, a target image Ic separated from the reference image Ir by a predetermined value or more is input, as shown in FIG. 10.
- in this case, the target image Ic is set as a temporary reference image Itr, a provisional reference image for the next and subsequent inputs.
- the temporary reference image Itr is a provisional reference image that is not saved as history.
- next, a target image Ic separated from the temporary reference image Itr is input as shown in (D) of FIG. 10, and a target image Ic separated from the temporary reference image Itr by a predetermined value or more is input as shown in (E) of FIG. 10.
- in this case, the temporary reference image Itr used so far is discarded, and the target image Ic is set as the temporary reference image Itr for the next and subsequent inputs.
- next, as shown in (H) of FIG. 10, it is assumed that a target image Ic separated by a predetermined value or more from both the reference image Ir and the temporary reference image Itr is input.
- in this case, the temporary reference image Itr used so far is discarded, and the target image Ic is set as the reference image Ir for the next and subsequent inputs.
- the reference image used so far, that is, the first reference image, is denoted Ir0, and the subsequent reference image is denoted Ir1.
- information about the feature points of the reference images Ir0 and Ir1 is stored for alignment.
- next, it is assumed that a target image Ic separated from the reference image Ir1 by a certain value or more is input.
- in this case, as shown in FIG. 10, the target image Ic is set as a temporary reference image Itr for the next and subsequent inputs.
- next, it is assumed that a target image Ic closer to the reference image Ir0 than to the temporary reference image Itr is input, as shown in (L) of FIG. 10.
- in this case, the temporary reference image Itr used so far is discarded, and the reference image for the next and subsequent inputs is set back to Ir0.
- in this way, the target image can be aligned against a past reference image even when the camera 20 returns to its original position.
- by using the temporary reference image Itr together with the reference image Ir, the data that needs to be recorded can be minimized.
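The reference/temporary-reference bookkeeping described above can be sketched as a small state-update function. This is only one reading of FIG. 10, and the function and parameter names are ours:

```python
def update_references(ref, temp_ref, target, dist, threshold):
    """One reading of the rules illustrated in FIG. 10 (names are ours).

    ref:      current reference image Ir (kept in history)
    temp_ref: current temporary reference image Itr, or None
    dist:     image-center distance function
    threshold: the 'predetermined value' from the text
    Returns the (reference, temporary reference) pair to use next."""
    far_ref = dist(target, ref) >= threshold
    far_tmp = temp_ref is not None and dist(target, temp_ref) >= threshold
    if far_ref and far_tmp:
        # far from both Ir and Itr: discard Itr and promote the target
        # to the next reference image Ir ((H)-(I) of FIG. 10)
        return target, None
    if far_ref or far_tmp:
        # far from Ir (or from the current Itr): the target becomes the
        # temporary reference, which is never saved as history
        return ref, target
    return ref, temp_ref
```

The switch back to a past reference image (shown in (L) of FIG. 10) would sit on top of this, comparing the target against stored past references as well.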
- the estimation unit 13 may estimate the motion between several images simultaneously when the images overlap. For example, as shown in FIG. 11, assume that there is an image (past target image Ip1) that overlaps both the reference image Ir and the target image Ic. The reference image Ir and the past target image Ip1 have already been aligned, that is, the conversion matrix R1 has already been derived. Let the coordinates of a feature point of the reference image Ir be (x0, y0, 1), the coordinates of the corresponding feature point of the past target image Ip1 be (x1, y1, 1), and the coordinates of the corresponding feature point of the target image Ic be (x2, y2, 1).
- Equation 7 associates the conversion formula R1 with the conversion formula R2.
- the estimation unit 13 simultaneously estimates R1 and R2 satisfying Equations 5 to 7 above by convergence calculation using an optimization method. In this case, the information in the feature point pairs of the reference image Ir and the past target image Ip1 is not wasted. Further, by estimating across a plurality of images simultaneously, the accumulation of errors can be suppressed compared with the case where the links are connected in series.
- when the target image Ic is close to a past reference image Ir, the estimation unit 13 performs alignment not only against the current reference image Ir but also against the past reference image Ir. For example, as shown in FIG. 13A, the target image Ic13 with image center C13 has a link Re13 to the reference image Ir12 with image center C12, and their relative position has been determined.
- here, when the target image Ic13 is close to a past reference image Ir1 with image center C1, the estimation unit 13 also aligns the reference image Ir1 with the target image Ic13. As a result, a link Re14 is established between the reference image Ir1 and the target image Ic13.
- in this way, the reference image Ir1 and the reference image Ir12 can be aligned through the target image Ic13. By achieving alignment between reference images Ir via the target image Ic, it is possible to position reference images Ir that originally have little mutual overlap.
- the estimation unit 13 may have a function of adjusting the overall position.
- the adjustment of the overall position refines the overall positional relationship of the drawing target images (the images written into the output image buffer). For example, the positions of all drawing target images are finely adjusted at the timing when a new link is established between reference images Ir, or when a plurality of past transformation matrices are updated by the simultaneous estimation of the motions of a plurality of images. That is, the conversion matrices R of all drawing target images are recalculated.
- the overall alignment does not use the feature points output by the matching unit 12; instead, corresponding points between images are extracted at random, or from predetermined positions, based on the result of the alignment, and the overall alignment is performed based on that position information. In this case, since the pairs of past feature points need not be retained, the memory usage can be reduced.
- the guide unit 15 has a function of guiding user operations.
- the guide unit 15 is connected to a display unit 21 that displays an image, and causes the display unit 21 to display a guide display that guides the user's camera operation.
- when, among the pairs of reference images Ir and target images Ic whose relative positions have been determined, there exists a first image whose hop count from the current reference image Ir is a predetermined value or more and which does not overlap the current reference image Ir, the guide unit 15 causes the display unit to display a guidance display that guides the imaging position from the current imaging position toward the image position of the first image. For example, the guide unit 15 operates as illustrated in FIG. 13.
- the guide unit 15 counts the number of hops (the number of links Re) between the image at image center C0 (the first image) and the current reference image at image center C8, and calculates the distance between the two images. When the count is a predetermined value or more and the distance is smaller than a predetermined value (for example, when the images do not overlap), the guide unit 15 determines that the images are connected in a long series. When images are connected in a long series, errors tend to accumulate. Therefore, as shown in FIG. 13B, the guide unit 15 displays the guidance display Ga so as to direct the imaging position of the camera 20 toward the image at image center C0 (the first image).
- as the guidance display Ga, a frame, an arrow, an icon, or the like is used, and audio guidance may be added.
- when a link is then formed between the image at image center C0 and the image at image center C8, as shown in FIG. 13, the overall position can be adjusted.
- the guide unit 15 avoids accumulation of errors by guiding the user.
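The guidance decision can be sketched as a breadth-first hop count over the link graph plus the distance test described above; `should_guide` and its threshold parameters are our names for the "predetermined values" in the text:

```python
from collections import deque

def hop_count(links, a, b):
    """Shortest number of links (hops) between images a and b in the
    alignment link graph; links is a list of (i, j) pairs."""
    adj = {}
    for i, j in links:
        adj.setdefault(i, set()).add(j)
        adj.setdefault(j, set()).add(i)
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, d = queue.popleft()
        if node == b:
            return d
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # not connected

def should_guide(links, first, current, center_dist, min_hops, near_dist):
    """Show the guidance display Ga when the chain is long
    (hop count >= min_hops) yet the two images are physically close
    (center distance < near_dist) without being linked directly."""
    hops = hop_count(links, first, current)
    return hops is not None and hops >= min_hops and center_dist < near_dist
```

For the example of FIG. 13, a chain C0-C1-...-C8 gives a hop count of 8, so the guide unit would steer the camera back toward C0 to close the loop.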
- the composition unit 14 is connected to the display unit 21 and has a function of drawing a composite image on the display unit 21.
- the synthesizing unit 14 projects the image group aligned on the spherical surface in the three-dimensional space (the drawing target images) onto a two-dimensional plane, using the transformation matrices estimated by the estimation unit 13.
- at this time, the image deformation shown in FIG. 4 is performed.
- the drawing target image is recorded in the output image buffer as a single composite image. For example, as shown in FIG. 14, it is assumed that the composite plane Sp is divided into a lattice pattern.
- the synthesizing unit 14 draws only the cells whose four corners are included in the image Id projected onto the synthesis plane Sp.
- the synthesizing unit 14 adjusts the blend ratio and the like at the boundary between images. In this way, a plurality of drawing target images Id are projected onto the synthesis plane Sp to generate a synthesized image.
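The lattice-based drawing rule (draw only the cells whose four corners fall inside the projected image Id) can be sketched as follows; the containment test `inside` stands in for the point-in-projected-quad check, which the patent does not detail:

```python
def cells_to_draw(grid_w, grid_h, cell, inside):
    """Return the lattice cells of the synthesis plane whose four corners
    all fall inside the projected image Id; `inside(x, y)` is the
    containment test (a stand-in for point-in-projected-quad)."""
    drawn = []
    for gy in range(grid_h):
        for gx in range(grid_w):
            corners = [(gx * cell, gy * cell),
                       ((gx + 1) * cell, gy * cell),
                       (gx * cell, (gy + 1) * cell),
                       ((gx + 1) * cell, (gy + 1) * cell)]
            if all(inside(x, y) for x, y in corners):
                drawn.append((gx, gy))
    return drawn
```

Cells straddling an image boundary are skipped here; in the embodiment those boundary regions are where the blend ratio between images is adjusted.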
- FIGS. 15 and 16 are flowcharts showing the operation of the image processing apparatus 1 according to this embodiment.
- the control processing shown in FIGS. 15 and 16 is executed, for example, at the timing when the imaging function of the mobile terminal 2 is turned on, and is repeatedly executed at a predetermined cycle.
- for ease of explanation, the target image Ic is assumed to be the second or a subsequent input image.
- the image processing apparatus 1 executes an image input process (S10: input step).
- in the image input process of S10, the input unit 10 inputs the target image Ic from the camera 20.
- when the process of S10 ends, the process proceeds to the alignment process (S12).
- the selection unit 11 selects the reference image Ir from the input image (S30: selection step).
- the matching unit 12 and the estimation unit 13 align the reference image Ir and the target image Ic (S32: matching step and estimation step).
- the estimation unit 13 determines whether or not the past (or other) reference image Ir and the target image Ic can be compared (S34). In the process of S34, when the estimation unit 13 determines that the comparison is possible, the past reference image Ir and the target image Ic are aligned (S36, for example, FIG. 12B).
- the estimation unit 13 sets a redraw flag to 1 for adjusting the overall position (S38).
- the selection unit 11 determines whether or not the distance between the target image Ic and the past reference image Ir is a predetermined value or more (S40).
- when the distance is the predetermined value or more, the target image Ic is recorded in the reference image list so as to become the next reference image Ir (S42).
- the reference image list is a list for referring to data in which pixel values and coordinates of feature points of the reference image are recorded.
- in the process of S34, when the estimation unit 13 determines that the comparison is not possible, the process proceeds to the process of determining the distance between the target image Ic and the past reference image Ir (S40).
- in the process of S40, when the selection unit 11 determines that the distance is not the predetermined value or more, the alignment process ends.
- in the process of S14, the synthesis unit 14 determines whether or not to add the target image Ic input in the process of S10 as a drawing image.
- the synthesizing unit 14 can refer to a drawing image list holding the image information of the images to be drawn; when the distance from the closest image among the images described in the list is equal to or greater than a predetermined value, the target image Ic input in the process of S10 is added as a drawing image. If it is determined in the process of S14 that the image is to be added, the process proceeds to the storage process (S16).
- the composition unit 14 adds the target image Ic to the drawing list and stores the target image Ic.
- the process proceeds to a redraw determination process (S18).
- in the process of S20, the estimation unit 13 performs the overall alignment process.
- here, the estimation unit 13 adjusts the positions of all the drawing images using the updated conversion matrices.
- the process of S20 ends, the process proceeds to a preview image drawing process (S22).
- the synthesizing unit 14 specifies the images to be drawn from, for example, the drawing image list, and generates a preview composite image by projecting from the spherical surface of the three-dimensional space onto the two-dimensional plane (S22: synthesis step). Thereafter, a preview image is output and displayed on the display unit 21 or the like (S24). When the process of S24 is completed, the routine proceeds to the image input determination process (S26).
- the input unit 10 determines whether or not the input of the target image Ic has been completed. If the input of the target image Ic is not completed in the process of S26, the process proceeds to S10 again. On the other hand, when the input of the target image Ic is completed, the process proceeds to a result image output process (S28).
- the synthesis unit 14 displays the synthesized image on the display unit 21 or the like.
- the control process shown in FIGS. 15 and 16 is completed.
- the image processing program includes a main module, an input module, and an arithmetic processing module.
- the main module is a part that comprehensively controls image processing.
- the input module operates the mobile terminal 2 so as to acquire an input image.
- the arithmetic processing module includes a selection module, a matching module, an estimation module, a synthesis module, and a guidance module. The functions realized by executing the main module, the input module, and the arithmetic processing module are the same as the functions of the input unit 10, the selection unit 11, the matching unit 12, the estimation unit 13, the synthesis unit 14, and the guide unit 15 of the image processing apparatus 1 described above.
- the image processing program is provided by a recording medium such as a ROM or a semiconductor memory, for example.
- the image processing program may be provided as a data signal via a network.
- As described above, according to the image processing device 1, the image processing method, and the image processing program of the present embodiment, the motion between the reference image Ir and the target image Ic is assumed to be caused only by the rotational motion of the camera 20, and a transformation matrix R that associates their coordinate systems is estimated.
- since this transformation matrix R does not include parameters for enlargement, reduction, or parallel movement, errors caused by the input target image Ic being reduced or otherwise deformed can be avoided.
- further, by limiting the estimation to rotation components, a reduced or otherwise deformed target image Ic is not adopted as the reference image Ir for subsequent inputs, so the accumulation of errors can be avoided. Therefore, even when input images are sequentially stitched together and images with different imaging orientations are included, the accumulation of errors can be suppressed and a high-quality panoramic image can be obtained.
- the above-described embodiment shows an example of the image processing apparatus according to the present invention.
- the image processing apparatus according to the present invention is not limited to the image processing apparatus 1 according to the embodiment; the image processing apparatus according to the embodiment may be modified, or applied to other uses, without departing from the gist described in each claim.
- the camera 20 may capture a moving image.
- the input unit 10 may have a function of extracting continuous images from the captured moving image.
- the image input by the input unit 10 may be an image transmitted from another device via a network.
- the size of the images captured by the camera 20 has been described as being the same; however, the size of the captured image may differ for each capture.
- the case where the input unit 10, the selection unit 11, the matching unit 12, the estimation unit 13, the synthesis unit 14, and the guide unit 15 are all provided has been described, but the configuration may be modified as necessary.
- the guide unit 15 may not be provided as necessary.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (13)
- An image processing device that sequentially inputs images captured by an imaging element and stitches them together at each input to generate a composite image, the device comprising:
an input unit that sequentially inputs the images;
a selection unit that selects a reference image from among the one or more previously input images that were input by the input unit before a target image, the target image being the image to be processed that is newly input by the input unit;
a matching unit that calculates a correspondence relationship between feature points of the reference image and feature points of the target image;
an estimation unit that, assuming that the motion between the reference image and the target image is caused only by rotational motion of the imaging element, estimates a conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using position information of the feature point pairs for which the correspondence relationship has been calculated by the matching unit; and
a synthesis unit that stitches the reference image and the target image together based on the conversion formula to generate the composite image.
- The image processing device according to claim 1, wherein the estimation unit estimates a conversion formula consisting only of rotation components about the axes of a three-dimensional coordinate system whose origin is the position of the imaging element.
- The image processing device according to claim 1 or 2, wherein the estimation unit prepares an objective function that includes the differences between the position information of the feature point pairs converted using the conversion formula, and estimates the conversion formula by performing a convergence calculation using an optimization method so that the objective function takes a minimum value.
- The image processing device according to claim 3, wherein, when a plurality of images were input by the input unit before the target image, the estimation unit adopts, as the initial value of the convergence calculation, the conversion formula of the image input by the input unit immediately before the target image.
- The image processing device according to claim 3, wherein the estimation unit projects the feature point pairs for which the correspondence relationship has been calculated by the matching unit onto a spherical surface, and estimates the conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using the position information of the projected feature point pairs.
- The image processing device according to any one of claims 1 to 5, wherein, when a plurality of images were input by the input unit before the target image and at least one of those images overlaps both the reference image and the target image, the estimation unit estimates, in association with each other, a conversion formula that associates the coordinate system of the reference image with the coordinate system of the overlapping image and a conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using the feature point pairs of the reference image, the overlapping image, and the target image.
- The image processing device according to any one of claims 1 to 6, wherein, when the distance between the reference image and the target image is a predetermined value or more, the selection unit selects that target image as the reference image for target images newly input by the input unit from the next input onward.
- The image processing device according to claim 7, wherein, when the distance between the target image and a past reference image is smaller than the distance between the target image and the reference image, the selection unit selects the past reference image as the reference image for target images newly input from the next input onward.
- The image processing device according to claim 8, wherein, when the distance between the target image and a past reference image is a predetermined value or less, the matching unit further calculates a correspondence relationship between feature points of the past reference image and feature points of the target image,
the estimation unit further estimates a conversion formula that associates the coordinate system of the past reference image with the coordinate system of the target image, using position information of the feature point pairs of the past reference image and the target image, and
the estimation unit associates the reference image with the past reference image using the conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image and the conversion formula that associates the coordinate system of the past reference image with the coordinate system of the target image.
- The image processing device according to any one of claims 1 to 9, further comprising a guide unit that is connected to a display unit that displays images and causes the display unit to display a guidance display that guides the user's camera operation, wherein
the estimation unit links the reference image and the target image for which the conversion formula has been estimated, and records them as a pair whose relative position has been determined,
the synthesis unit outputs the composite image to the display unit, and
the guide unit, when there exists among the pairs whose relative positions have been determined a first image whose hop count from the current reference image is a predetermined value or more and which does not overlap the current reference image, causes the display unit to display the guidance display that guides the imaging position from the current imaging position to the image position of the first image.
- An image processing method for sequentially inputting images captured by an imaging element and stitching them together at each input to generate a composite image, the method comprising:
an input step of sequentially inputting the images;
a selection step of selecting a reference image from among the one or more previously input images that were input in the input step before a target image, the target image being the image to be processed that is newly input in the input step;
a matching step of calculating a correspondence relationship between feature points of the reference image and feature points of the target image;
an estimation step of, assuming that the motion between the reference image and the target image is caused only by rotational motion of the imaging element, estimating a conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using position information of the feature point pairs for which the correspondence relationship has been calculated in the matching step; and
a synthesis step of stitching the reference image and the target image together based on the conversion formula to generate the composite image.
- An image processing program that causes a computer to function so as to sequentially input images captured by an imaging element and stitch them together at each input to generate a composite image, wherein
the program causes the computer to function as:
an input unit that sequentially inputs the images;
a selection unit that selects a reference image from among the one or more previously input images that were input by the input unit before a target image, the target image being the image to be processed that is newly input by the input unit;
a matching unit that calculates a correspondence relationship between feature points of the reference image and feature points of the target image;
an estimation unit that, assuming that the motion between the reference image and the target image is caused only by rotational motion of the imaging element, estimates a conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using position information of the feature point pairs for which the correspondence relationship has been calculated by the matching unit; and
a synthesis unit that stitches the reference image and the target image together based on the conversion formula to generate the composite image.
- A computer-readable recording medium recording an image processing program that causes a computer to function so as to sequentially input images captured by an imaging element and stitch them together at each input to generate a composite image, wherein
the recorded image processing program causes the computer to function as:
an input unit that sequentially inputs the images;
a selection unit that selects a reference image from among the one or more previously input images that were input by the input unit before a target image, the target image being the image to be processed that is newly input by the input unit;
a matching unit that calculates a correspondence relationship between feature points of the reference image and feature points of the target image;
an estimation unit that, assuming that the motion between the reference image and the target image is caused only by rotational motion of the imaging element, estimates a conversion formula that associates the coordinate system of the reference image with the coordinate system of the target image, using position information of the feature point pairs for which the correspondence relationship has been calculated by the matching unit; and
a synthesis unit that stitches the reference image and the target image together based on the conversion formula to generate the composite image.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/233,294 US10136054B2 (en) | 2012-10-24 | 2012-10-24 | Image processing device for compositing panoramic images, image processing program and recording medium |
KR1020167036356A KR101819851B1 (ko) | 2012-10-24 | 2012-10-24 | 화상 처리 장치, 화상 처리 방법, 및 기록 매체 |
CN201280072705.3A CN104272344B (zh) | 2012-10-24 | 2012-10-24 | 图像处理装置以及图像处理方法 |
PCT/JP2012/077491 WO2014064783A1 (ja) | 2012-10-24 | 2012-10-24 | 画像処理装置、画像処理方法、画像処理プログラム及び記録媒体 |
CN201710531324.1A CN107256532B (zh) | 2012-10-24 | 2012-10-24 | 图像处理装置、图像处理方法以及记录介质 |
JP2013513456A JP5493114B1 (ja) | 2012-10-24 | 2012-10-24 | 画像処理装置、画像処理方法、画像処理プログラム及び記録媒体 |
KR1020157011097A KR101692652B1 (ko) | 2012-10-24 | 2012-10-24 | 화상 처리 장치, 화상 처리 방법, 및 기록 매체 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/077491 WO2014064783A1 (ja) | 2012-10-24 | 2012-10-24 | 画像処理装置、画像処理方法、画像処理プログラム及び記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014064783A1 true WO2014064783A1 (ja) | 2014-05-01 |
Family
ID=50544181
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/077491 WO2014064783A1 (ja) | 2012-10-24 | 2012-10-24 | 画像処理装置、画像処理方法、画像処理プログラム及び記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10136054B2 (ja) |
JP (1) | JP5493114B1 (ja) |
KR (2) | KR101692652B1 (ja) |
CN (2) | CN104272344B (ja) |
WO (1) | WO2014064783A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9277122B1 (en) * | 2015-08-13 | 2016-03-01 | Legend3D, Inc. | System and method for removing camera rotation from a panoramic video |
JP2017120653A (ja) * | 2015-01-19 | 2017-07-06 | 株式会社リコー | 線形パノラマ画像連結のためのプレビュー画像取得ユーザインタフェース |
US10713828B2 (en) | 2016-02-02 | 2020-07-14 | Morpho, Inc. | Image processing device, image processing method, non-transitory computer readable recording medium and photographing assist equipment |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3039617B1 (en) | 2013-08-31 | 2020-05-20 | ML Netherlands C.V. | User feedback for real-time checking and improving quality of scanned image |
EP3540683A1 (en) | 2013-12-03 | 2019-09-18 | ML Netherlands C.V. | User feedback for real-time checking and improving quality of scanned image |
WO2015104235A1 (en) | 2014-01-07 | 2015-07-16 | Dacuda Ag | Dynamic updating of composite images |
EP3748953B1 (en) * | 2014-01-07 | 2024-04-17 | ML Netherlands C.V. | Adaptive camera control for reducing motion blur during real-time image capture |
JP6452440B2 (ja) * | 2014-12-26 | 2019-01-16 | 任天堂株式会社 | 画像表示システム、画像表示装置、画像表示方法、およびプログラム |
KR102468086B1 (ko) * | 2015-11-06 | 2022-11-17 | 삼성전자주식회사 | 컨텐츠 표시 방법 및 이를 구현한 전자 장치 |
JP6604908B2 (ja) | 2016-06-10 | 2019-11-13 | キヤノン株式会社 | 画像処理装置、その制御方法、および制御プログラム |
EP3515317B1 (en) * | 2016-09-20 | 2020-05-20 | Koninklijke Philips N.V. | Ultrasound transducer tile registration |
KR20180051288A (ko) * | 2016-11-08 | 2018-05-16 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
US10410362B2 (en) * | 2016-11-14 | 2019-09-10 | Htc Corporation | Method, device, and non-transitory computer readable storage medium for image processing |
US10453204B2 (en) * | 2016-12-06 | 2019-10-22 | Adobe Inc. | Image alignment for burst mode images |
JP6699902B2 (ja) * | 2016-12-27 | 2020-05-27 | 株式会社東芝 | 画像処理装置及び画像処理方法 |
US10616551B2 (en) * | 2017-01-27 | 2020-04-07 | OrbViu Inc. | Method and system for constructing view from multiple video streams |
CN108965687B (zh) * | 2017-05-22 | 2021-01-29 | 阿里巴巴集团控股有限公司 | 拍摄方向识别方法、服务器及监控方法、系统及摄像设备 |
JP2019012360A (ja) * | 2017-06-29 | 2019-01-24 | キヤノン株式会社 | 情報処理装置、プログラム及び情報処理方法 |
CN107945204B (zh) * | 2017-10-27 | 2021-06-25 | 西安电子科技大学 | 一种基于生成对抗网络的像素级人像抠图方法 |
JP7118729B2 (ja) * | 2018-05-11 | 2022-08-16 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
CN109074757B (zh) * | 2018-07-03 | 2021-11-09 | 达闼机器人有限公司 | 一种建立地图的方法、终端和计算机可读存储介质 |
US10719944B2 (en) * | 2018-09-13 | 2020-07-21 | Seiko Epson Corporation | Dynamic object tracking |
US10986287B2 (en) | 2019-02-19 | 2021-04-20 | Samsung Electronics Co., Ltd. | Capturing a photo using a signature motion of a mobile device |
CN110232654A (zh) * | 2019-04-24 | 2019-09-13 | 薄涛 | 图像合成方法、装置、设备及其存储介质 |
FI20196125A1 (en) * | 2019-12-23 | 2021-06-24 | Truemed Oy | A method for identifying the authenticity of an object |
KR20210133472A (ko) * | 2020-04-29 | 2021-11-08 | 삼성전자주식회사 | 이미지 병합 방법 및 이를 수행하는 데이터 처리 장치 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1173492A (ja) * | 1996-11-08 | 1999-03-16 | Ricoh Co Ltd | 座標変換方法、画像合成方法及び情報記録媒体 |
JP2002042125A (ja) * | 2000-07-26 | 2002-02-08 | Minolta Co Ltd | 画像合成装置、画像合成方法、および、画像合成プログラムを記録したコンピュータ読取可能な記録媒体 |
JP2009060278A (ja) * | 2007-08-30 | 2009-03-19 | Olympus Imaging Corp | カメラ及びこれに適用されるパノラマ撮影ガイド表示方法,パノラマ撮影ガイド表示プログラム |
JP2011104137A (ja) * | 2009-11-18 | 2011-06-02 | Aloka Co Ltd | 超音波診断システム |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5977977A (en) * | 1995-08-04 | 1999-11-02 | Microsoft Corporation | Method and system for multi-pass rendering |
US6075905A (en) * | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
US6104840A (en) * | 1996-11-08 | 2000-08-15 | Ricoh Company, Ltd. | Method and system for generating a composite image from partially overlapping adjacent images taken along a plurality of axes |
US7092012B2 (en) * | 1996-11-15 | 2006-08-15 | Canon Kabushiki Kaisha | Image processing apparatus and method, storage medium, and communication system |
US6078701A (en) * | 1997-08-01 | 2000-06-20 | Sarnoff Corporation | Method and apparatus for performing local to global multiframe alignment to construct mosaic images |
US6657667B1 (en) * | 1997-11-25 | 2003-12-02 | Flashpoint Technology, Inc. | Method and apparatus for capturing a multidimensional array of overlapping images for composite image generation |
US6891561B1 (en) * | 1999-03-31 | 2005-05-10 | Vulcan Patents Llc | Providing visual context for a mobile active visual display of a panoramic region |
JP2001104137A (ja) | 1999-10-07 | 2001-04-17 | Denso Corp | 皿用idタグ |
JP4169464B2 (ja) | 1999-12-28 | 2008-10-22 | 株式会社リコー | 画像処理方法と画像処理装置及びコンピュータ読み取り可能な記録媒体 |
US7656429B2 (en) * | 2004-02-04 | 2010-02-02 | Hewlett-Packard Development Company, L.P. | Digital camera and method for in creating still panoramas and composite photographs |
US7424218B2 (en) * | 2005-07-28 | 2008-09-09 | Microsoft Corporation | Real-time preview for panoramic images |
US7639897B2 (en) * | 2006-01-24 | 2009-12-29 | Hewlett-Packard Development Company, L.P. | Method and apparatus for composing a panoramic photograph |
US7995861B2 (en) * | 2006-12-13 | 2011-08-09 | Adobe Systems Incorporated | Selecting a reference image for images to be joined |
KR100800804B1 (ko) * | 2006-12-27 | 2008-02-04 | Samsung Electronics Co., Ltd. | Method for capturing panoramic images |
KR100869952B1 (ko) * | 2007-02-14 | 2008-11-24 | Samsung Electronics Co., Ltd. | Method and apparatus for capturing panoramic photographs |
CN101304515A (zh) * | 2007-05-11 | 2008-11-12 | Xu Shigang | Panoramic reversing guidance system |
EP2158576A1 (en) * | 2007-06-08 | 2010-03-03 | Tele Atlas B.V. | Method of and apparatus for producing a multi-viewpoint panorama |
US8717412B2 (en) * | 2007-07-18 | 2014-05-06 | Samsung Electronics Co., Ltd. | Panoramic image production |
JP4377932B2 (ja) * | 2007-07-26 | 2009-12-02 | Morpho, Inc. | Panoramic image generation device and program |
JP5144237B2 (ja) * | 2007-12-05 | 2013-02-13 | Canon Inc. | Image processing device, control method therefor, and program |
JP5223318B2 (ja) * | 2007-12-07 | 2013-06-26 | Sony Corp | Image processing device, image processing method, and program |
JP4720859B2 (ja) * | 2008-07-09 | 2011-07-13 | Casio Computer Co., Ltd. | Image processing device, image processing method, and program |
CN101656840B (zh) * | 2008-08-22 | 2011-09-28 | PixArt Imaging Inc. | Wide-angle sensor array module and image correction, operation methods and applications thereof |
WO2010025309A1 (en) * | 2008-08-28 | 2010-03-04 | Zoran Corporation | Robust fast panorama stitching in mobile phones or cameras |
IL193906A (en) * | 2008-09-04 | 2012-06-28 | Pro Track Ltd | Methods and systems for creating an aligned bank of images with an iterative self-correction technique for coordinate acquisition and object detection |
US10080006B2 (en) * | 2009-12-11 | 2018-09-18 | Fotonation Limited | Stereoscopic (3D) panorama creation on handheld device |
JP5163676B2 (ja) * | 2010-03-19 | 2013-03-13 | Casio Computer Co., Ltd. | Imaging device, imaging method, and program |
US8428390B2 (en) * | 2010-06-14 | 2013-04-23 | Microsoft Corporation | Generating sharp images, panoramas, and videos from motion-blurred videos |
CN101963751B (zh) | 2010-08-19 | 2011-11-30 | Northwestern Polytechnical University | High-resolution real-time panoramic high-dynamic-range image acquisition device and method |
CN101984463A (zh) | 2010-11-02 | 2011-03-09 | ZTE Corporation | Panorama synthesis method and device |
KR101106567B1 (ko) * | 2011-09-29 | 2012-01-19 | 한국종합설계 주식회사 | Image drawing processing system for editing that allows partial correction of drawing images |
JP5522545B2 (ja) * | 2011-10-18 | 2014-06-18 | Casio Computer Co., Ltd. | Imaging device, imaging method, and program |
US9516223B2 (en) * | 2012-06-06 | 2016-12-06 | Apple Inc. | Motion-based image stitching |
US9325861B1 (en) * | 2012-10-26 | 2016-04-26 | Google Inc. | Method, system, and computer program product for providing a target user interface for capturing panoramic images |
- 2012-10-24 KR KR1020157011097A patent/KR101692652B1/ko active IP Right Grant
- 2012-10-24 JP JP2013513456A patent/JP5493114B1/ja active Active
- 2012-10-24 KR KR1020167036356A patent/KR101819851B1/ko active IP Right Grant
- 2012-10-24 WO PCT/JP2012/077491 patent/WO2014064783A1/ja active Application Filing
- 2012-10-24 CN CN201280072705.3A patent/CN104272344B/zh active Active
- 2012-10-24 US US14/233,294 patent/US10136054B2/en active Active
- 2012-10-24 CN CN201710531324.1A patent/CN107256532B/zh active Active
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017120653A (ja) * | 2015-01-19 | 2017-07-06 | Ricoh Co., Ltd. | Preview image acquisition user interface for linear panoramic image stitching |
US9277122B1 (en) * | 2015-08-13 | 2016-03-01 | Legend3D, Inc. | System and method for removing camera rotation from a panoramic video |
WO2017027884A1 (en) * | 2015-08-13 | 2017-02-16 | Legend3D, Inc. | System and method for removing camera rotation from a panoramic video |
US10713828B2 (en) | 2016-02-02 | 2020-07-14 | Morpho, Inc. | Image processing device, image processing method, non-transitory computer readable recording medium and photographing assist equipment |
Also Published As
Publication number | Publication date |
---|---|
KR101819851B1 (ko) | 2018-01-17 |
CN107256532A (zh) | 2017-10-17 |
US20150229840A1 (en) | 2015-08-13 |
KR20170002693A (ko) | 2017-01-06 |
KR101692652B1 (ko) | 2017-01-03 |
CN107256532B (zh) | 2020-09-18 |
KR20150065778A (ko) | 2015-06-15 |
CN104272344A (zh) | 2015-01-07 |
JPWO2014064783A1 (ja) | 2016-09-05 |
CN104272344B (zh) | 2017-07-14 |
US10136054B2 (en) | 2018-11-20 |
JP5493114B1 (ja) | 2014-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014064783A1 (ja) | Image processing device, image processing method, image processing program, and recording medium | |
JP4196216B2 (ja) | Image synthesis system, image synthesis method, and program | |
US10452945B2 (en) | Image generating device, electronic device, image generating method and recording medium | |
JP4620607B2 (ja) | Image processing device | |
JP6100089B2 (ja) | Image processing device, image processing method, and program | |
JP4941950B1 (ja) | Image processing device, image processing method, and image processing program | |
JP4377932B2 (ja) | Panoramic image generation device and program | |
WO2014068779A1 (ja) | Image processing device, image processing method, image processing program, and recording medium | |
US20090225174A1 (en) | Image processing apparatus, image processing method, hand shake blur area estimation device, hand shake blur area estimation method, and program | |
US20120269444A1 (en) | Image compositing apparatus, image compositing method and program recording device | |
JP5493112B2 (ja) | Image processing device, image processing method, and image processing program | |
JP5022498B2 (ja) | Image processing device, image processing method, and image processing program | |
JP2012003503A (ja) | Image processing device, control method therefor, and program | |
JP5687370B2 (ja) | Image processing device, image processing method, image processing program, and recording medium | |
JP4128123B2 (ja) | Camera shake correction device, camera shake correction method, and computer-readable recording medium storing a camera shake correction program | |
JP4930304B2 (ja) | Image processing device, image processing method, program, and recording medium | |
US11106042B2 (en) | Image processing apparatus, head-mounted display, and image displaying method | |
JP5928228B2 (ja) | Subject detection device, subject detection method, and program | |
JP2013042213A (ja) | Image processing device, image processing method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | ENP | Entry into the national phase | Ref document number: 2013513456; Country of ref document: JP; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: 14233294; Country of ref document: US |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12887273; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: 20157011097; Country of ref document: KR; Kind code of ref document: A |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 12887273; Country of ref document: EP; Kind code of ref document: A1 |