WO2012049768A1 - Image processing device, image processing method, and image processing program - Google Patents
Image processing device, image processing method, and image processing program
- Publication number
- WO2012049768A1 (PCT/JP2010/068162; JP2010068162W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- center point
- point
- center
- image processing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3876—Recombination of partial images to recreate the original image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present invention relates to an image processing apparatus, an image processing method, and an image processing program.
- Patent Document 1 describes an apparatus that creates a panoramic still image, a single wide-angle still image, by connecting captured images.
- The image processing apparatus described in Patent Document 1 performs positioning using a small region within the area where a first image and a second image overlap, cuts the overlapping area out of one image, and joins the image from which the overlapping area has been cut out to the other image to create the panoramic still image.
- The present invention has been made to solve this technical problem, and an object thereof is to provide an image processing device, an image processing method, and an image processing program capable of improving the quality of a composite image while reducing the load of combining a plurality of images.
- The image processing apparatus according to the present invention sequentially generates a composite image by joining a first image, composed of a single image or a plurality of such images joined together, with an input second image each time the second image is input. The apparatus comprises: a center position acquisition unit that acquires position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation unit that acquires the first center point of the image that overlaps the second image among the images constituting the first image and, based on the acquired position information of the first center point and of the second center point, generates the composite image by joining the first image and the second image along the perpendicular bisector between the acquired first center point and the second center point as the seam.
- In this image processing apparatus, the center position acquisition unit acquires the position information of the first center point of each image constituting the first image and of the second center point of the second image, and the composite image generation unit acquires the first center point of the image overlapping the second image and joins the two images along the perpendicular bisector between that first center point and the second center point to generate the composite image.
- the first image may be an image input immediately before the second image or the composite image generated by the composite image generation unit.
- the center position acquisition unit acquires a motion vector based on the first image and the second image, and acquires the position information based on the acquired motion vector.
- Preferably, the composite image generation unit determines the pixel value at a predetermined position of the composite image based on the distance from that position to the perpendicular bisector between the first center point closest to the position and the second center point. With this configuration, the pixel value at a predetermined position of the composite image is determined from the distance to the perpendicular bisector, so the images can be combined with simple arithmetic processing.
- Preferably, the composite image generation unit takes the pixel value of the first image as the pixel value at the predetermined position when the distance is greater than a predetermined value and the position is closer to the first center point than to the second center point; takes the pixel value of the second image when the distance is greater than the predetermined value and the position is closer to the second center point; and, when the distance is equal to or less than the predetermined value, combines the pixel value of the first image and the pixel value of the second image to obtain the pixel value at the position.
- Preferably, the composite image generation unit takes the predetermined positions to be grid points arranged in a lattice, and records, for each grid point, the first center point closest to that grid point. With this configuration, it is no longer necessary to compare the center points of all images constituting the first image with the center point of the second image for every pixel in the region where the first image and the second image overlap, so processing time and processing cost can be reduced.
- the composite image generation unit determines a pixel value in a block surrounded by the grid points based on the distance obtained for each grid point.
- Preferably, the composite image generation unit takes the pixel value of the first image as the pixel value throughout a block when the distances of all grid points surrounding the block are greater than a predetermined value and all of those grid points are closer to the first center point than to the second center point, and takes the pixel value of the second image throughout the block when the distances of all grid points surrounding the block are greater than the predetermined value and all of those grid points are closer to the second center point than to the first center point.
- the composite image generation unit updates the nearest first center point recorded for each lattice point after generating the composite image.
- The image processing method according to the present invention sequentially generates a composite image by joining a first image, composed of a single image or a plurality of such images joined together, with an input second image each time the second image is input. The method comprises: a center position acquisition step of acquiring position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation step of acquiring the first center point of the image that overlaps the second image among the images constituting the first image and, based on the acquired position information of the first center point and of the second center point, generating the composite image by joining the first image and the second image along the perpendicular bisector between the acquired first center point and the second center point as the seam.
- the image processing method according to the present invention has the same effects as those of the image processing apparatus of the present invention described above.
- The image processing program according to the present invention causes a computer to function so as to sequentially generate a composite image by joining a first image, composed of a single image or a plurality of such images joined together, with an input second image each time the second image is input. The program causes the computer to function as: a center position acquisition unit that acquires position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation unit that acquires the first center point of the image that overlaps the second image among the images constituting the first image and, based on the acquired position information of the first center point and of the second center point, generates the composite image by joining the first image and the second image along the perpendicular bisector between the acquired first center point and the second center point as the seam.
- the image processing program according to the present invention has the same effects as those of the image processing apparatus of the present invention described above.
- According to the present invention, it is possible to improve the quality of the composite image while reducing the load of combining a plurality of images.
- The image processing apparatus according to the present embodiment sequentially creates a single image by joining input images each time an image is input. It is suitably employed, for example, when joining a plurality of continuously captured images in real time to generate a panoramic image with a wider angle of view than any single captured image.
- the image processing apparatus according to the present embodiment is preferably mounted on a mobile terminal with limited resources such as a mobile phone, a digital camera, and a PDA (Personal Digital Assistant), but is not limited thereto. For example, it may be mounted on a normal computer system.
- In the following, for ease of understanding, a mobile terminal having a camera function will be described as an example of the image processing apparatus according to the present invention.
- FIG. 1 is a functional block diagram of a mobile terminal 2 including an image processing apparatus 1 according to the present embodiment.
- The mobile terminal 2 shown in FIG. 1 is, for example, a mobile terminal carried by a user, and has the hardware configuration shown in FIG. 2.
- FIG. 2 is a hardware configuration diagram of the mobile terminal 2.
- The portable terminal 2 is physically configured as a normal computer system including a CPU (Central Processing Unit) 100, main storage devices such as a ROM (Read Only Memory) 101 and a RAM (Random Access Memory) 102, an input device 103 such as a camera and a keyboard, an output device 104 such as a display, and an auxiliary storage device 105 such as a hard disk.
- Each function of the portable terminal 2 and of the image processing apparatus 1 described later is realized by loading predetermined computer software onto hardware such as the CPU 100, the ROM 101, and the RAM 102, operating the input device 103 and the output device 104 under the control of the CPU 100, and reading and writing data in the main storage devices and the auxiliary storage device 105.
- The image processing apparatus 1 may likewise be configured as a normal computer system including the CPU 100, main storage devices such as the ROM 101 and the RAM 102, the input device 103, the output device 104, and the auxiliary storage device 105.
- the mobile terminal 2 may include a communication module or the like.
- the mobile terminal 2 includes a camera 30, an image processing device 1, and a display unit 31.
- the camera 30 has a function of capturing an image.
- an image sensor or the like is used as the camera 30.
- the camera 30 has a continuous imaging function that repeatedly captures images at a predetermined interval from a timing specified by a user operation or the like, for example.
- the user can slide the camera 30 to capture continuous images that overlap at least vertically and horizontally.
- the camera 30 has a function of outputting a captured image to the image processing apparatus 1 every time it is captured.
- the image processing apparatus 1 includes an image input unit 10, a center position acquisition unit 11, a composite image generation unit 12, and a center position storage unit 13.
- the image input unit 10 has a function of inputting an image captured by the camera 30.
- the image input unit 10 has a function of inputting, for example, an image captured by the camera 30 every time it is captured.
- the image input unit 10 also has a function of saving the first input image in a first temporary storage area provided in the mobile terminal 2.
- the image input unit 10 has a function of saving images input continuously from the next time in a second temporary storage area provided in the mobile terminal.
- the second temporary storage area is updated each time a new image is input, and the first temporary storage area is overwritten and saved with an image (intermediate composite image) that is sequentially combined each time an image is input.
- In the following, the image stored in the first temporary storage area is referred to as the first image, and the image stored in the second temporary storage area is referred to as the second image.
- the center position acquisition unit 11 has a function of acquiring the position information of the center point of the image (the initial first image or the second image) input by the image input unit 10.
- the center point is a point that is uniquely determined from the outer edge of the image.
- the position information may be position information associated with the real space, or may be relative position information associated with images input continuously.
- the center position acquisition unit 11 has a function of detecting a camera motion (motion vector) based on the input image and the image input immediately before it in order to acquire the position information.
- The center position acquisition unit 11 has a function of calculating the position information of the center point of the input image based on the obtained motion vector and the position information of the center point of the image input immediately before.
- That is, the center position acquisition unit 11 acquires the position information of the center point directly only for the first input image (the initial first image); for each subsequently input image (second image), it acquires the position information of the center point based on the motion vector obtained from the input image and the image input immediately before it. For example, for the second image input at the n-th time (n > 1), a motion vector is acquired using that image and the second image input at the (n-1)-th time, and the position information of the center point of the n-th second image is acquired based on the acquired motion vector.
- Note that, in calculating the motion vector, the center position acquisition unit 11 may use, instead of the image input immediately before as it is, a reduced version of that image containing only its luminance component.
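- As a concrete illustration, the center-point bookkeeping described above can be sketched as follows. This is a minimal sketch, not the implementation described here: cv2.phaseCorrelate is assumed as one possible motion estimator, and the function names are hypothetical.

```python
import cv2
import numpy as np

def estimate_shift(prev_gray, curr_gray):
    """One possible motion estimator: phase correlation between two
    luminance-only frames (same size, converted to float32)."""
    (dx, dy), _response = cv2.phaseCorrelate(np.float32(prev_gray),
                                             np.float32(curr_gray))
    return np.array([dx, dy])

def next_center_point(prev_center, prev_gray, curr_gray):
    """Center point of the newly input image in composite coordinates:
    the previous center offset by the estimated motion vector.  The sign
    of the offset depends on the estimator's convention and should be
    verified against a known camera motion."""
    return prev_center + estimate_shift(prev_gray, curr_gray)
```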
- In the following, the center point of each image constituting the first image is referred to as a first center point, and the center point of the second image is referred to as the second center point.
- the center position acquisition unit 11 has a function of outputting the acquired position information of the center point to the composite image generation unit 12.
- The composite image generation unit 12 has a function of generating a composite image by joining the input image (second image) to the already input image (first image), and includes a distance calculation unit 121 and a synthesis unit 122.
- the distance calculation unit 121 has a function of specifying an image that overlaps the second image among the images constituting the first image based on, for example, the motion vector acquired by the center position acquisition unit 11. Then, the distance calculation unit 121 specifies the first center point closest to the predetermined position of the image that overlaps the second image, and calculates the distance between the specified first center point and the second center point of the second image. It has a function.
- Here, the predetermined positions are positions at grid points arranged in a lattice; for example, grid points are arranged over the composite image (here, the first image) and used as the predetermined positions described above.
- Before calculating the distance between the first center point and the second center point, the distance calculation unit 121 specifies, for each grid point, the first center point closest to that grid point and stores it in the center position storage unit 13 in advance. That is, the center position storage unit 13 stores each grid point contained in the first image in association with the first center point closest to it.
- The distance calculation unit 121 identifies the image that overlaps the input second image among the images constituting the first image, refers to the center position storage unit 13, and acquires the first center point closest to each grid point in the identified region.
- The distance calculation unit 121 then calculates, for each grid point, the distance between the first center point, which may differ from grid point to grid point, and the second center point.
- the distance calculation unit 121 has a function of outputting the calculated distance to the synthesis unit 122.
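- A minimal sketch of this bookkeeping is given below; the grid pitch, the coordinate convention, and all names are assumptions for illustration, not details given by this text.

```python
import numpy as np

GRID = 32  # assumed grid pitch in pixels

# Center position storage unit 13 as a plain dictionary: maps a grid
# point index (ix, iy) to the position of its nearest first center point.
nearest_first_center = {}

def grid_points(bbox):
    """Indices of the grid points covering a bounding box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = bbox
    for iy in range(int(y0) // GRID, int(y1) // GRID + 1):
        for ix in range(int(x0) // GRID, int(x1) // GRID + 1):
            yield ix, iy

def center_distances(second_center, overlap_bbox):
    """For each grid point in the overlap region, the distance C between
    its recorded nearest first center point and the second center point."""
    c2 = np.asarray(second_center, dtype=np.float64)
    out = {}
    for gp in grid_points(overlap_bbox):
        p1 = nearest_first_center.get(gp)
        if p1 is not None:
            out[gp] = float(np.linalg.norm(np.asarray(p1) - c2))
    return out
```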
- The synthesis unit 122 has a function of joining the first image and the second image based on the distance between the first center point and the second center point calculated by the distance calculation unit 121. For example, the synthesis unit 122 has a function of determining the pixel value at a predetermined position in the composite image based on the distance from that position to the perpendicular bisector between the first center point and the second center point.
- FIGS. 3 and 4 are schematic diagrams for explaining the perpendicular bisector between a first center point and a second center point. As shown in FIG. 3, a perpendicular bisector L1 can be drawn between the first center point P1 of the first image F1 and the second center point P2 of the second image F2.
- The distance from a predetermined position in the composite image to the perpendicular bisector L1 is calculated, and the pixel value at that position is determined.
- When the first image is composed of a plurality of images F1 and F2, as shown in FIG. 4, perpendicular bisectors L2 and L3 can be drawn between each of the first center points P1 and P2 and the second center point P3.
- In this way, a plurality of perpendicular bisectors can be drawn. In this case as well, the distance from a predetermined position in the composite image to the perpendicular bisector is calculated, and the pixel value at that position is determined.
- The synthesis unit 122 evaluates the distance to the perpendicular bisector in order to adopt, as the pixel value at a given position, the pixel value of whichever of the first image and the second image is closer to that position.
- The synthesis unit 122 uses the distance from a predetermined position in the composite image to the perpendicular bisector as an evaluation value for evaluating the proximity of the input image. For example, the synthesis unit 122 evaluates the closeness T of the input image using the following Expression 1.
- Here, A is the distance from the predetermined position to be synthesized to the second center point, B is the distance from that position to the nearest first center point, and C is the distance from that first center point to the second center point.
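- Expression 1 itself is not reproduced in this text. A form consistent with the definitions above, and with the later numerical example (a threshold of W = 16 corresponding to image positions within 8 pixels of the bisector), is T = (A^2 - B^2) / C. Under this reading, T is twice the signed distance from the position to the perpendicular bisector: T is positive where the position is closer to the first center point (B < A), negative where it is closer to the second center point, and zero exactly on the bisector.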
- The synthesis unit 122 determines the pixel value at a predetermined position using the closeness T obtained by Expression 1 as an evaluation value. For example, when the distance from the position to the perpendicular bisector is greater than a predetermined value and the position is closer to the first center point than to the second center point, the synthesis unit 122 takes the pixel value of the first image as the pixel value at that position. Conversely, when the distance is greater than the predetermined value and the position is closer to the second center point than to the first center point, it takes the pixel value of the second image as the pixel value at that position.
- When the distance is equal to or less than the predetermined value, the synthesis unit 122 combines the pixel value of the first image and the pixel value of the second image to obtain the pixel value at the position. Any conventional blending method can be employed; for example, the average or a weighted average of the pixel value of the first image and the pixel value of the second image is used as the pixel value at the position.
- In other words, the synthesis unit 122 has a function of determining, with the perpendicular bisector as the boundary, whether a given position in the composite image is closer to the first center point or to the second center point, and thereby deciding which image's pixel value to adopt. For positions in the vicinity of the perpendicular bisector, that is, positions whose distance from the bisector in the composite image is equal to or less than the predetermined value, the synthesis unit 122 blends the pixel values of the first image and the second image, reducing the luminance difference at the seam and generating a composite image with less visual discomfort.
- In this way, the pixel value at a predetermined position is determined using the closeness T as the evaluation value, with the predetermined value W serving as the threshold for deciding whether to blend pixel values. For example, when W is set to 16, the pixel value of the first image and the pixel value of the second image are blended at image positions within 8 pixels of the perpendicular bisector.
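- A minimal sketch of this per-position decision, using the reconstructed Expression 1 above (the names and the particular blend ramp near the seam are assumptions; the text specifies only that an average or weighted average is used):

```python
import numpy as np

W = 16  # threshold on the closeness T; |T| <= W means "blend"

def closeness(pos, first_center, second_center):
    """Closeness T at pos, per the reconstructed Expression 1:
    T = (A^2 - B^2) / C, i.e. twice the signed distance to the bisector."""
    A = np.linalg.norm(pos - second_center)  # distance to second center point
    B = np.linalg.norm(pos - first_center)   # distance to nearest first center
    C = np.linalg.norm(first_center - second_center)
    return (A * A - B * B) / C

def pixel_value(pos, first_center, second_center, first_px, second_px):
    T = closeness(np.asarray(pos, float), np.asarray(first_center, float),
                  np.asarray(second_center, float))
    if T > W:    # clearly on the first image's side of the seam
        return first_px
    if T < -W:   # clearly on the second image's side of the seam
        return second_px
    w = (T + W) / (2.0 * W)  # assumed linear ramp: 0 at T=-W, 1 at T=+W
    return w * first_px + (1.0 - w) * second_px  # weighted average near seam
```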
- The grid points in the composite image are arranged in a lattice so as to cover the first image before the second image is input. When the second image is input, grid points are newly added so as to cover not only the first image but also the second image.
- the combining unit 122 can also read the first center point closest to the lattice point at high speed by referring to the center position storage unit 13.
- The synthesis unit 122 also has a function of determining the pixel values in a block surrounded by grid points based on the determination results at those grid points, in order to further increase speed. For example, the synthesis unit 122 determines the pixel value at each grid point based on the distance from that grid point to the perpendicular bisector between the first center point closest to the grid point and the second center point.
- For a block (region) surrounded by grid points at which the pixel value of the first image was adopted, the synthesis unit 122 adopts the pixel value of the first image; that is, it performs no processing on the block and moves on to the next block. For a block surrounded by grid points at which the pixel value of the second image was adopted, it adopts the pixel value of the second image; that is, it copies the second image into the block as it is.
- In the remaining blocks, the pixel values are composites of the pixel values of the first image and the second image. The closeness T at each pixel position inside such a block is obtained by linear interpolation from the closeness T at the surrounding grid points and evaluated by the method described above, so the pixel positions to be blended can be identified appropriately.
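- The block-level shortcut can be sketched as follows; this is a sketch under assumptions, since the corner ordering, the bilinear interpolation of T over the block, and the clipped blend ramp are illustrative choices rather than details fixed by this text.

```python
import numpy as np

def fill_block(T_corners, first_blk, second_blk, W=16):
    """Decide one block's pixels from the closeness T at its four corners.

    T_corners: 2x2 array, T at [[top-left, top-right],
    [bottom-left, bottom-right]]; first_blk / second_blk: the block's
    pixels taken from the first and second images (same shape).
    """
    T = np.asarray(T_corners, dtype=np.float64)
    if (T > W).all():    # whole block on the first image's side: skip it
        return first_blk
    if (T < -W).all():   # whole block on the second image's side: copy F2
        return second_blk
    # Mixed block: interpolate T to every pixel, then blend near the seam.
    h, w = first_blk.shape[:2]
    ys = np.linspace(0.0, 1.0, h)[:, None]
    xs = np.linspace(0.0, 1.0, w)[None, :]
    Tij = ((1 - ys) * ((1 - xs) * T[0, 0] + xs * T[0, 1])
           + ys * ((1 - xs) * T[1, 0] + xs * T[1, 1]))
    alpha = np.clip((Tij + W) / (2.0 * W), 0.0, 1.0)  # 1 -> first image
    if first_blk.ndim == 3:
        alpha = alpha[..., None]  # broadcast over color channels
    return alpha * first_blk + (1.0 - alpha) * second_blk
```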
- The synthesis unit 122 has a function of updating the closest first center point recorded for each grid point after generating a composite image by the above processing. For example, when a composite image is generated by joining the first image and the second image, the closest first center point may change for some of the grid points contained in the composite image. By updating the closest first center point after the joining process, accurate nearest-center information is maintained.
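- A minimal sketch of this update step, reusing the hypothetical grid-point table from the earlier sketch (first_centers is assumed to be the list of all first center points after the join):

```python
import numpy as np

def update_nearest_centers(nearest_first_center, first_centers, grid=32):
    """After a join, re-check each recorded grid point against every first
    center point and store whichever is now closest (which may be the
    center point of the image that was just merged)."""
    centers = [np.asarray(c, dtype=np.float64) for c in first_centers]
    for (ix, iy) in list(nearest_first_center):
        gp = np.array([ix * grid, iy * grid], dtype=np.float64)
        dists = [np.linalg.norm(gp - c) for c in centers]
        nearest_first_center[(ix, iy)] = centers[int(np.argmin(dists))]
```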
- The synthesis unit 122 overwrites and saves the composite image created by joining the first image and the second image in the first temporary storage area.
- The synthesis unit 122 thus always stores the latest composite image in the first temporary storage area; that is, when there is a second image to be input next, the process of joining that second image to the latest composite image (intermediate composite image) is executed.
- Because the synthesis unit 122 combines input images sequentially rather than recording and holding all of the images to be combined, the images can be synthesized with a small amount of memory.
- the synthesizing unit 122 has a function of outputting the synthesized image stored in the first temporary storage area to the display unit 31.
- the display unit 31 is connected to the composite image generation unit 12 and has a function of notifying the user of the output composite image. For example, a liquid crystal display or the like is used as the display unit 31.
- FIG. 5 is a flowchart showing the operation of the image processing apparatus 1 according to the present embodiment.
- the control process shown in FIG. 5 is executed, for example, at the timing when the imaging function of the mobile terminal 2 is turned on, and is repeatedly executed at a predetermined cycle.
- FIGS. 6 and 7 are schematic diagrams for the case where one image is joined to a single image that has already been input, and FIGS. 8 and 9 are schematic diagrams for the case where one image is joined to an image that has already been input and synthesized.
- the image processing apparatus 1 executes an initial process (S12).
- the image input unit 10 inputs an image F1 from the camera 30 and stores the image F1 in the first temporary storage area as the first image F1.
- the center position acquisition unit 11 acquires position information of the first center point P1, which is the center point of the first image F1.
- The composite image generation unit 12 arranges grid points Kn (n: integer) in a lattice over a region including the first image F1.
- The composite image generation unit 12 specifies the first center point P1 as the first center point closest to each grid point Kn, and records each grid point Kn in the center position storage unit 13 in association with the first center point P1. This completes the initialization.
- When the processing of S12 is completed, the flow proceeds to input processing for the second image (S14).
- the image input unit 10 inputs the image F2 from the camera 30, and stores it as the second image F2 in the second temporary storage area.
- the second image F2 is an image of the same size captured at an imaging position different from the imaging position of the first image F1, and has an overlapping area with the first image F1.
- the center position acquisition unit 11 acquires the position information of the second center point P2, which is the center point of the second image F2. For example, the center position acquisition unit 11 acquires the position information of the second center point P2 based on the motion vectors of the first image F1 and the second image F2.
- When the processing of S16 ends, the flow proceeds to acquisition of the distance between the center points (S18).
- The distance calculation unit 121 calculates the distance between the first center point P1 and the second center point P2 based on the position information of the first center point P1 obtained in S12 and the position information of the second center point P2 obtained in S16. As shown in FIG. 6, the distance C between the first center point P1 of the first image F1 and the second center point P2 of the second image F2 is calculated. By calculating the distance C between the center points, the distance from a given position to the perpendicular bisector L1 can be evaluated, and the images F1 and F2 can be joined along the perpendicular bisector L1. When the processing of S18 is completed, the flow proceeds to the synthesis processing (S20).
- In the synthesis processing, the synthesis unit 122 joins the image F1 and the image F2 to generate a composite image.
- the first image F1 and the second image F2 are arranged in the coordinate space of the composite image.
- New grid points Km (m: integer; shown dotted in the figure) are added.
- For the added grid points Km, the synthesis unit 122 records each grid point Km in the center position storage unit 13 in association with a closest first center point; at this time, this first center point is set to a point at infinity.
- The synthesis unit 122 evaluates the distance from the perpendicular bisector L1 at each of the grid points Kn and Km arranged over the first image F1 and the second image F2, and determines the pixel values at the grid points Kn and Km. For example, the closeness T is calculated using Expression 1 described above, in order starting from the grid point Kn at the upper left. Then, with the predetermined value W used to decide whether to blend pixel values set to 16, the closeness T is evaluated and the pixel value at each grid point Kn is determined.
- For example, for the grid point X1, the synthesis unit 122 refers to the center position storage unit 13 to acquire the closest first center point P1 and the distance C between the first center point P1 and the second center point P2, calculates the distance B to the first center point P1 and the distance A to the second center point P2, and computes the closeness T using Expression 1. Similar processing is performed for the grid point X2. Since the grid points X1 and X2 are grid points Kn arranged within the first image F1, the synthesis unit 122 evaluates them by calculating the closeness T as described above. For the newly added grid points Km, on the other hand, the closeness T is clearly below the threshold, so the closeness T of these grid points Km is set to negative infinity and its calculation is omitted.
- The synthesis unit 122 evaluates the calculated closeness T at each grid point Kn. For a block whose four corner grid points Kn all have closeness T greater than 16, processing is skipped. For example, since the closeness T is greater than 16 at all of the grid points X3 to X6, processing is skipped for the block surrounded by X3 to X6. Conversely, for a block whose four corner grid points all have closeness T smaller than -16, the pixel value of the second image F2 is adopted as the pixel value of the block; for example, the second image F2 is copied into the block surrounded by the grid points X7 to X10. When the closeness T at the four corner grid points is neither all greater than 16 nor all smaller than -16, the pixel values of the first image F1 and the second image F2 are blended within the block. For example, the closeness T is greater than 0 at the grid points X11 and X12 and smaller than 0 at the grid points X13 and X14, so the pixel values of the first image F1 and the second image F2 are blended in the block they surround.
- Specifically, the closeness T at each pixel position in the block is obtained by linear interpolation from the closeness T at X11 to X14 and evaluated against the threshold W in the same manner as described above. For pixel positions judged to require blending, a weighted average of the pixel value of the first image F1 and the pixel value of the second image F2 is calculated and used as the pixel value at that position.
- Next, the synthesis unit 122 updates the first center points recorded in the center position storage unit 13 (S22). Now that the image F2 has been merged, there are two first center points, P1 and P2. The synthesis unit 122 therefore updates, for each grid point Kn of the first image stored in the first temporary storage area, the closest first center point. For example, for the grid point X2, the first center point P1 is the closer of P1 and P2, so the record is unchanged from before and no update is executed. For the grid point X1, on the other hand, the first center point P2 is the closer of P1 and P2, so the stored information in the center position storage unit 13 is updated.
- When the processing of S22 ends, the flow proceeds to determining whether a further input image exists (S24).
- The image input unit 10 determines whether there is an image to be input next. For example, when the current imaging count is smaller than the automatic continuous imaging count, it is determined that an input image exists. If it is determined in S24 that an input image exists, the flow returns to the image input processing (S14). Then, for example, the image F3 is input and stored in the second temporary storage area, and the center position acquisition unit 11 acquires the position of the center point P3 of the image F3 (S16). The distance calculation unit 121 then calculates the distances between each of the first center points P1 and P2 of the composite image composed of the images F1 and F2 stored in the first temporary storage area and the second center point P3 of the input image F3 (S18).
- The synthesis unit 122 joins the images F1 and F2 with the image F3 to generate a composite image.
- the first images F1, F2 and the second image F3 are arranged in the coordinate space of the composite image.
- New grid points Km (m: integer; shown dotted in the figure) are added.
- For the added grid points Km, the synthesis unit 122 identifies the center point P3 as the first center point closest to each grid point Km, and records each grid point Km in the center position storage unit 13 in association with P3.
- The synthesis unit 122 evaluates the distance from the perpendicular bisectors L2 and L3 at each of the grid points Kn and Km (for example, X15, X16, X17, and so on) arranged over the first images F1 and F2 and the second image F3, and determines the pixel values at the grid points Kn and Km.
- In this way, an image in which the images F1, F2, and F3 are combined is generated.
- The synthesis unit 122 then updates the closest center point recorded for each grid point Kn (S22). As described above, while an input image remains, the processing of S14 to S24 is executed repeatedly.
- On the other hand, if it is determined in S24 that no further input image exists, the flow proceeds to the display processing (S26).
- the image processing apparatus 1 outputs the composite image stored in the first temporary storage area to the display unit 31 for display.
- the image processing apparatus 1 may cut out both ends of the composite image, adjust the size, and output the combined image to the display unit 31.
- the control process shown in FIG. 5 ends.
- the process of S26 may be performed every time one image is input (that is, between S20 and S24).
- the image processing program includes a main module, an input module, and an arithmetic processing module.
- the main module is a part that comprehensively controls image processing.
- the input module operates the mobile terminal 2 so as to acquire an input image.
- The arithmetic processing module includes a center position acquisition module, a distance calculation module, and a synthesis module. The functions realized by executing the main module, the input module, and the arithmetic processing module are the same as the functions of the image input unit 10, the center position acquisition unit 11, the distance calculation unit 121, and the synthesis unit 122 of the image processing apparatus 1 described above, respectively.
- the image processing program is provided by a storage medium such as a ROM or a semiconductor memory, for example.
- the image processing program may be provided as a data signal via a network.
- As described above, according to the image processing apparatus 1 of the present embodiment, the center position acquisition unit 11 acquires the position information of the first center point of each image constituting the first image and of the second center point of the second image, and the composite image generation unit 12 acquires the first center point of the image that overlaps the second image among the images constituting the first image and, based on the acquired position information of the first and second center points, joins the first image and the second image along the perpendicular bisector between the first center point and the second center point to generate the composite image.
- Thus, when the first image and the input second image are combined sequentially, the seams between images can be identified and joined using the information on the center points of the images, so synthesis with a light processing load can be realized. Furthermore, since the amount of misalignment between the first image and the second image can be kept small by using the perpendicular bisector, the quality of the composite image can be improved.
- Further, according to the image processing apparatus 1 of the present embodiment, the composite image generation unit 12 determines the pixel value at a predetermined position of the composite image based on the distance from that position to the perpendicular bisector between the first center point closest to the position and the second center point, so the images can be combined by simple arithmetic processing.
- In addition, since the region where pixel values are blended can be limited to within a predetermined range of the perpendicular bisector, synthesis can be performed at high speed, and a smooth composite image can be generated while suppressing the effect on the composite image even when the second image contains blur or misalignment.
- Moreover, since the first center point closest to each grid point is recorded, it becomes unnecessary to compare the center points of all the images constituting the first image with the center point of the second image for every pixel in the region where the first image and the second image overlap, which reduces processing time and processing cost.
- Further, the pixel values in a block can be determined without calculating the distance at every pixel position in the block, and since whether to copy can be decided for each block surrounded by grid points and the synthesis processing executed in units of blocks, the processing load can be reduced further.
- the above-described embodiment shows an example of the image processing apparatus according to the present invention.
- The image processing apparatus according to the present invention is not limited to the image processing apparatus 1 of the embodiment; the image processing apparatus of the embodiment may be modified, or applied to other uses, without departing from the gist described in each claim.
- the camera 30 may capture a moving image.
- the image input unit 10 may have a function of extracting continuous images from the captured moving image.
- the image input by the image input unit 10 may be an image transmitted from another device via a network.
- In the embodiment described above, the images captured by the camera 30 were described as having the same size; however, the captured images may differ in size from one capture to the next.
- In the embodiment described above, the case where the center position acquisition unit 11 calculates the motion vector using the input image and the image input immediately before was described; however, the motion vector calculation method is not limited to this. For example, the motion vector may be calculated using the input image and the composite image generated so far.
- the area surrounded by the lattice points is described as a rectangle, but it may be a triangle or another polygon.
- DESCRIPTION OF SYMBOLS: 1... Image processing apparatus, 10... Image input unit, 11... Center position acquisition unit, 12... Composite image generation unit, 121... Distance calculation unit, 122... Synthesis unit, 13... Center position storage unit.
Claims (11)
- 1. An image processing device that sequentially generates a composite image by joining a first image, composed of a single image or a plurality of said images joined together, with an input second image each time the second image is input, the device comprising: a center position acquisition unit that acquires position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation unit that acquires the first center point of an image overlapping the second image among the images constituting the first image and, based on the acquired position information of the first center point and the position information of the second center point, generates the composite image by joining the first image and the second image using the perpendicular bisector between the acquired first center point and the second center point as the seam.
- 2. The image processing device according to claim 1, wherein the first image is the image input immediately before the second image or the composite image generated by the composite image generation unit.
- 3. The image processing device according to claim 1 or 2, wherein the center position acquisition unit acquires a motion vector based on the first image and the second image, and acquires the position information based on the acquired motion vector.
- 4. The image processing device according to any one of claims 1 to 3, wherein the composite image generation unit determines the pixel value at a predetermined position of the composite image based on the distance from the perpendicular bisector between the first center point closest to the predetermined position and the second center point.
- 5. The image processing device according to claim 4, wherein the composite image generation unit: takes the pixel value of the first image as the pixel value at the predetermined position when the distance is greater than a predetermined value and the predetermined position is closer to the first center point than to the second center point; takes the pixel value of the second image as the pixel value at the predetermined position when the distance is greater than the predetermined value and the predetermined position is closer to the second center point than to the first center point; and combines the pixel value of the first image and the pixel value of the second image to obtain the pixel value at the predetermined position when the distance is equal to or less than the predetermined value.
- 6. The image processing device according to claim 4 or 5, wherein the composite image generation unit takes the predetermined positions to be positions at grid points arranged in a lattice, and records, for each grid point, the first center point closest to that grid point.
- 7. The image processing device according to claim 6, wherein the composite image generation unit determines pixel values within a block surrounded by grid points based on the distances obtained for the grid points.
- 8. The image processing device according to claim 7, wherein the composite image generation unit: takes the pixel value of the first image as the pixel values within the block when the distances of all grid points surrounding the block are greater than a predetermined value and the positions of all grid points surrounding the block are closer to the first center point than to the second center point; and takes the pixel value of the second image as the pixel values within the block when the distances of all grid points surrounding the block are greater than the predetermined value and the positions of all grid points surrounding the block are closer to the second center point than to the first center point.
- 9. The image processing device according to any one of claims 6 to 8, wherein the composite image generation unit updates the closest first center point recorded for each grid point after generating the composite image.
- 10. An image processing method for sequentially generating a composite image by joining a first image, composed of a single image or a plurality of said images joined together, with an input second image each time the second image is input, the method comprising: a center position acquisition step of acquiring position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation step of acquiring the first center point of an image overlapping the second image among the images constituting the first image and, based on the acquired position information of the first center point and the position information of the second center point, generating the composite image by joining the first image and the second image using the perpendicular bisector between the acquired first center point and the second center point as the seam.
- 11. An image processing program for causing a computer to function so as to sequentially generate a composite image by joining a first image, composed of a single image or a plurality of said images joined together, with an input second image each time the second image is input, the program causing the computer to function as: a center position acquisition unit that acquires position information of first center points, each being the center point of one of the images constituting the first image, and position information of a second center point, being the center point of the second image; and a composite image generation unit that acquires the first center point of an image overlapping the second image among the images constituting the first image and, based on the acquired position information of the first center point and the position information of the second center point, generates the composite image by joining the first image and the second image using the perpendicular bisector between the acquired first center point and the second center point as the seam.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/068162 WO2012049768A1 (ja) | 2010-10-15 | 2010-10-15 | 画像処理装置、画像処理方法及び画像処理プログラム |
CN201080012144.9A CN102656604B (zh) | 2010-10-15 | 2010-10-15 | 图像处理装置、图像处理方法以及图像处理程序 |
JP2010542864A JP5022498B2 (ja) | 2010-10-15 | 2010-10-15 | 画像処理装置、画像処理方法及び画像処理プログラム |
EP10848624.2A EP2490172B1 (en) | 2010-10-15 | 2010-10-15 | Image processing device, image processing method and image processing program |
US13/259,031 US8682103B2 (en) | 2010-10-15 | 2010-10-15 | Image processing device, image processing method and image processing program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2010/068162 WO2012049768A1 (ja) | 2010-10-15 | 2010-10-15 | 画像処理装置、画像処理方法及び画像処理プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012049768A1 true WO2012049768A1 (ja) | 2012-04-19 |
Family
ID=45934215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/068162 WO2012049768A1 (ja) | 2010-10-15 | 2010-10-15 | 画像処理装置、画像処理方法及び画像処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8682103B2 (ja) |
EP (1) | EP2490172B1 (ja) |
JP (1) | JP5022498B2 (ja) |
CN (1) | CN102656604B (ja) |
WO (1) | WO2012049768A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7983835B2 (en) | 2004-11-03 | 2011-07-19 | Lagassey Paul J | Modular intelligent transportation system |
CN105144687B (zh) * | 2013-04-30 | 2019-07-26 | 索尼公司 | 图像处理装置、图像处理方法及计算机可读介质 |
US9392166B2 (en) | 2013-10-30 | 2016-07-12 | Samsung Electronics Co., Ltd. | Super-resolution in processing images such as from multi-layer sensors |
US10217257B1 (en) * | 2015-03-17 | 2019-02-26 | Amazon Technologies, Inc. | Process for contextualizing continuous images |
EP3557865A4 (en) * | 2017-01-23 | 2020-01-08 | NTT DoCoMo, Inc. | INFORMATION PROCESSING SYSTEM AND INFORMATION PROCESSING DEVICE |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002042109A (ja) * | 2000-07-21 | 2002-02-08 | Topcon Corp | 医用画像合成処理装置及び方法並びに記憶媒体 |
JP2006119730A (ja) * | 2004-10-19 | 2006-05-11 | Seiko Epson Corp | 画像のつなぎ合わせ |
JP2006345400A (ja) * | 2005-06-10 | 2006-12-21 | Matsushita Electric Ind Co Ltd | ビデオカメラ装置 |
JP2009033224A (ja) * | 2007-07-24 | 2009-02-12 | Nippon Hoso Kyokai <Nhk> | 合成画像生成装置および合成画像生成プログラム |
JP2009033392A (ja) * | 2007-07-26 | 2009-02-12 | Morpho Inc | パノラマ画像生成装置およびプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6236748B1 (en) * | 1994-08-02 | 2001-05-22 | Canon Kabushiki Kaisha | Compound eye image pickup device utilizing plural image sensors and plural lenses |
US5727093A (en) * | 1995-02-07 | 1998-03-10 | Canon Kabushiki Kaisha | Image processing method and apparatus therefor |
US6549681B1 (en) * | 1995-09-26 | 2003-04-15 | Canon Kabushiki Kaisha | Image synthesization method |
JP2004334843A (ja) * | 2003-04-15 | 2004-11-25 | Seiko Epson Corp | 複数の画像から画像を合成する方法 |
JP4560716B2 (ja) * | 2004-09-28 | 2010-10-13 | アイシン精機株式会社 | 車両の周辺監視システム |
-
2010
- 2010-10-15 EP EP10848624.2A patent/EP2490172B1/en not_active Not-in-force
- 2010-10-15 US US13/259,031 patent/US8682103B2/en active Active
- 2010-10-15 WO PCT/JP2010/068162 patent/WO2012049768A1/ja active Application Filing
- 2010-10-15 CN CN201080012144.9A patent/CN102656604B/zh not_active Expired - Fee Related
- 2010-10-15 JP JP2010542864A patent/JP5022498B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002042109A (ja) * | 2000-07-21 | 2002-02-08 | Topcon Corp | 医用画像合成処理装置及び方法並びに記憶媒体 |
JP2006119730A (ja) * | 2004-10-19 | 2006-05-11 | Seiko Epson Corp | 画像のつなぎ合わせ |
JP2006345400A (ja) * | 2005-06-10 | 2006-12-21 | Matsushita Electric Ind Co Ltd | ビデオカメラ装置 |
JP2009033224A (ja) * | 2007-07-24 | 2009-02-12 | Nippon Hoso Kyokai <Nhk> | 合成画像生成装置および合成画像生成プログラム |
JP2009033392A (ja) * | 2007-07-26 | 2009-02-12 | Morpho Inc | パノラマ画像生成装置およびプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2490172A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN102656604B (zh) | 2014-10-22 |
CN102656604A (zh) | 2012-09-05 |
US8682103B2 (en) | 2014-03-25 |
EP2490172A4 (en) | 2018-01-17 |
JP5022498B2 (ja) | 2012-09-12 |
EP2490172B1 (en) | 2019-03-27 |
EP2490172A1 (en) | 2012-08-22 |
JPWO2012049768A1 (ja) | 2014-02-24 |
US20120093436A1 (en) | 2012-04-19 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080012144.9 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010542864 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13259031 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010848624 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10848624 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |