WO2005046221A1 - Image processing apparatus, image processing method, program therefor, and recording medium - Google Patents
Image processing apparatus, image processing method, program therefor, and recording medium
- Publication number
- WO2005046221A1 (PCT/JP2004/017128)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- image
- image processing
- shift amount
- still
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4053—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
Definitions
- Image processing apparatus, image processing method, program therefor, and recording medium
- The present invention relates to an image processing apparatus, an image processing method, a computer program, and a recording medium for generating one still image using a plurality of images.
- As background art for an image processing apparatus and an image processing method that generate a single high-resolution image by combining a plurality of images, a technique is known in which one scene is selected from a moving image captured by a digital video camera and a still image with a higher resolution (higher pixel density) than the selected one-frame image is generated.
- In this technique, one frame image is selected as a reference image from a series of (n + 1) frame images, and the motion vectors of the other n frame images (target images) with respect to this reference image are respectively determined.
- A still image is then generated by synthesizing the (n + 1) frame images (see, for example, Japanese Patent Application Laid-Open No. 2000-240845). Such an image synthesis processing method is said to achieve a higher-quality, clearer image than simply converting the resolution of a one-frame image. Disclosure of the invention
- An object of the present invention is to solve such a problem and execute efficient image processing when generating one high-resolution image from a plurality of images.
- To solve at least a part of the above problems, the first image processing apparatus of the present invention employs the following configuration.
- It is an image processing apparatus that generates a still image with a high pixel density from a plurality of images, comprising: image extracting means for extracting a plurality of images used for generating the still image; shift amount detecting means for detecting the shift amount between the extracted plurality of images; specifying means for specifying two or more images from the plurality of extracted images based on the detected shift amounts; and image synthesizing means for generating one still image by synthesizing the specified two or more images.
- A first image processing method of the present invention is an image processing method for generating a still image with a high pixel density from a plurality of images, comprising: extracting a plurality of images used for generating the still image; detecting the shift amount between the extracted plurality of images; specifying two or more images from the plurality of extracted images based on the detected shift amounts; and generating one still image by synthesizing the specified two or more images.
- According to the first image processing apparatus and the first image processing method (referred to as the first image processing), the shift amount between a plurality of images is detected, and two or more images specified on the basis of that shift amount are synthesized.
- In the first image processing, a configuration may be adopted in which one image serving as a reference for synthesizing the one still image is specified, and the plurality of images are extracted in accordance with a predetermined order associated with the specified image. According to such image processing, the extraction of the plurality of images used for synthesizing a high-pixel-density still image is performed in a previously associated order based on the specified image, so the multiple images to be used for synthesis are extracted automatically by specifying a single image.
- the plurality of images may be a plurality of images that are continuous in time series, and the associated order may be an order that is continuous in time series from the specified image.
- the image extraction process can be simplified.
- The number of images used for synthesizing the image may be displayed prior to generation of the one still image.
- the user can easily recognize the number of images used for the image synthesis processing.
- the number of images not used for the composition may be displayed.
- A warning may be displayed when the number of the specified two or more images does not reach a predetermined number. Further, in that case, whether or not to execute the synthesis of the images may be left to the user's selection. According to such image processing, the user is warned in advance that the number of images used for the image synthesis does not reach the predetermined number, and can therefore know beforehand that effective sharpening of the generated still image cannot be expected.
- In this case, the synthesizing process, which requires a long processing time, can be stopped at an early stage without being performed. For example, by setting a predetermined number of images in advance, the synthesizing process can be stopped immediately once the number of images whose detected shift amount exceeds the threshold reaches that number.
- Images in which the detected shift amount exceeds a predetermined threshold may be excluded from the plurality of extracted images, and the images other than those thus excluded may be specified as the two or more images.
- The predetermined threshold may be set as a specific value, or as a predetermined ratio to the number of pixels of the image to be synthesized, for example several percent to 10 percent of the number of vertical and horizontal pixels. Of course, a fixed value may also be used: for a translational shift, the threshold for the determination may be about ±16 pixels, and for a rotational shift, about ±1°.
- In this case, an image whose translational shift relative to another image exceeds ±16 pixels or whose rotational shift exceeds ±1° is excluded from the image synthesis targets, or the synthesis process is stopped, because such an image may contain blur. By making the judgment against these thresholds, the image synthesis can be performed while excluding images that are unlikely to contribute to it, or the synthesis can be cancelled altogether.
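The threshold judgment just described can be sketched as follows. This is a minimal, hypothetical Python illustration (the function and variable names are ours, not the patent's), assuming each shift is a (u, v, delta) triple with translation in pixels and rotation in degrees, and using the fixed ±16-pixel / ±1° thresholds mentioned above:

```python
# Hypothetical sketch: exclude target images whose detected shift
# exceeds the thresholds suggested in the text (±16 px translation,
# ±1° rotation). All names are illustrative, not from the patent.

def usable_for_synthesis(shift, max_trans=16.0, max_rot=1.0):
    """shift is a (u, v, delta) tuple: translation in pixels, rotation in degrees."""
    u, v, delta = shift
    return abs(u) <= max_trans and abs(v) <= max_trans and abs(delta) <= max_rot

def filter_targets(shifts):
    """Return indices of target images whose shift is within the thresholds."""
    return [i for i, s in enumerate(shifts) if usable_for_synthesis(s)]

shifts = [(2.0, -3.5, 0.2),   # small camera shake: usable
          (20.0, 1.0, 0.1),   # fast pan: excluded
          (1.0, 0.5, 1.8)]    # strong rotation: excluded
kept = filter_targets(shifts)
```

Only the first of the three example shifts survives the filter; the other two would either be dropped from the synthesis or trigger cancellation, as the text describes.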
- The plurality of images used for the image processing may be a plurality of frame images included in a moving image. The processing is particularly effective when a plurality of low-resolution frame images are combined to generate one high-resolution still image.
- The plurality of images used in the image processing may also be a plurality of still images carrying information on the exposure time, which changes according to the brightness of the shooting target at the time of shooting, and the predetermined threshold may be set based on that exposure time.
- In this case, a threshold for the shift amount between images is set for each still image from the information on its exposure time at the time of photographing.
- For example, based on the ratio between the photographing time interval and the exposure times of the two images whose shift is detected, the shift amount allowable within the exposure time of one still image is examined, and the threshold is set accordingly.
- In this way, a threshold corresponding to each still image can be set adaptively, instead of a fixed threshold.
- the present invention can also be implemented as a computer program product and a medium recording a computer program.
- FIG. 1 is an explanatory diagram showing an image processing system as a first embodiment of the present invention.
- FIG. 2 is a flowchart of the image processing of the first embodiment.
- FIG. 3 is an explanatory diagram showing a positional shift between two images.
- FIG. 4 is an explanatory diagram showing a method of calculating the amount of translational deviation by the gradient method.
- FIG. 5 is an explanatory diagram schematically showing the amount of rotation deviation of a pixel.
- FIG. 6 is an explanatory diagram of a relationship between an image shift amount and a range contributing to sharpening.
- FIG. 7 is a flowchart of the image processing of the second embodiment.
- FIG. 8 is an explanatory diagram of a deviation amount detection procedure in the second embodiment.
- FIG. 9 is a flowchart of the image processing of the third embodiment.
- FIG. 10 is an explanatory diagram of a relationship between a shooting cycle of a still image and an exposure time.
- 1. Configuration of the image processing device:
- FIG. 1 is an explanatory diagram showing an image processing system 100 as a first embodiment of the present invention.
- The image processing system 100 comprises an image database 20 that supplies image data such as moving images and still images, a personal computer 30 serving as an image processing device that executes image processing on a plurality of images input from the image database 20, a user interface 40 with which a user instructs execution of the image processing, and a color printer 50 for outputting the processed image.
- The image database 20 comprises equipment that handles images, such as a digital video camera 21, a digital still camera 22, a DVD 23, and a hard disk 24, and supplies image data to the personal computer 30.
- The image data held in the image database 20 in the first embodiment is moving image data acquired by the digital video camera 21.
- A unit of the image data handled in the image processing of the present embodiment is called a frame image; each frame image is one of the plurality of images that make up the moving image.
- The personal computer 30 includes a CPU 31 for executing the image processing, a ROM 32, a RAM 33, a hard disk 34 on which the image processing software is installed, and an I/F circuit 35 for exchanging data with external devices such as the image database 20, the user interface 40, and the color printer 50.
- The image processing software installed on the hard disk 34 synthesizes a plurality of input frame images to generate one high-resolution still image.
- The personal computer 30 on which this software is installed thus has the functions of the "image extracting means", "shift amount detecting means", "exclusion means", and "image synthesizing means" of the image processing apparatus. The flow of this image processing will be described later in detail.
- The user interface 40 includes a keyboard 41 and a mouse 42 with which the user performs image processing operations, and a display 43 that shows the frame images before the image processing and the still image after the synthesis processing.
- FIG. 2 is a flowchart of the image processing according to the first embodiment in which a plurality of pieces of image data are combined to generate one still image.
- the image processing installed in the personal computer 30 is started.
- The personal computer 30 inputs moving image data, which is a set of frame image data, from the image database 20 and reproduces it on the display 43.
- The user performs an operation of pausing the reproduced moving image at the scene that he or she wants to output as a still image, thereby specifying that scene (frame image) (step S200).
- The personal computer 30 extracts the frame images to be used for the image processing in chronological order, starting from the specified frame image (step S210).
- In this embodiment, four consecutive frame images are input in chronological order from the timing of the frame-image designation operation.
- The frame image specified by the user, that is, the first one in the time series, is the reference frame image (F1), and the other frame images are the target frame images (F2 to F4).
- the number of frame images to be extracted may be arbitrarily set by the user.
- The personal computer 30 detects the shift amount between the four frame images thus designated and extracted (step S220). As shown in Fig. 3, the shift detected here is the positional displacement between two images, represented by three elements: the translational shifts u and v and the rotational shift δ.
- The personal computer 30 detects the shift amount (u, v, δ) between the reference frame image F1 and each of the three target frame images (F2 to F4). The method of detecting the shift amount will be described later.
- The personal computer 30 determines whether or not the shift amount between frame images adjacent in the time series is within a predetermined range (threshold). For example, if the shift amount between the target frame image F2 and the target frame image F3 is larger than the threshold, it can be presumed that the digital video camera 21 was being panned quickly and deliberately.
- As a result of this determination, the personal computer 30 performs a process of excluding the target frame image F3, which may contain blur, from the synthesis targets (step S230).
- In addition, a lower-limit threshold is set for the translational shifts u and v, and the judgment is made on the translational shifts between all the frame images.
- This is because, for example, the target frame image F2 and the target frame image F4 may be substantially the same image, in which case it is not necessary to use both for the image synthesis: when two frame images have almost no shift between them (i.e., they are the same image), it suffices to use one of them in the synthesis processing.
- In this embodiment, a lower-limit threshold of 0.1 pixel is set for the translational shifts u and v between frame images; if the detected shifts u and v are 0.1 pixel or less, the image is excluded from the synthesis targets.
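The near-duplicate exclusion above can be sketched as follows, in a hedged Python illustration (names are ours): if the translational shift between two adjacent frames is 0.1 pixel or less in both axes, the frames are treated as effectively identical and only one is kept.

```python
# Illustrative sketch of the lower-limit (near-duplicate) test:
# frames whose translational shift relative to the previous kept
# frame is within 0.1 pixel are dropped from the synthesis.

def is_near_duplicate(u, v, eps=0.1):
    """True if the (u, v) translational shift is within eps pixels."""
    return abs(u) <= eps and abs(v) <= eps

def drop_duplicates(shifts):
    """shifts[i] is the (u, v) shift of frame i+1 relative to frame i.
    Returns indices of frames to keep (frame 0 is always kept)."""
    keep = [0]
    for i, (u, v) in enumerate(shifts, start=1):
        if not is_near_duplicate(u, v):
            keep.append(i)
    return keep
```

With `drop_duplicates([(0.05, 0.0), (2.0, 1.0)])`, frame 1 is dropped as a duplicate of frame 0 and frames 0 and 2 are kept, matching the behavior described in the text.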
- The personal computer 30 performs the image synthesizing process using the reference frame image F1 and the target frame images that have not been excluded (step S240). Specifically, a target frame image whose positional shift has been corrected (for example, the target frame image F2) is superimposed on the reference frame image F1, and the gradation value of each pixel of the composite image is determined from the gradation values of the corresponding pixels of both.
- The well-known bilinear method is used to determine the gradation value of each pixel of the composite image. Instead of the bilinear method, another well-known method such as the nearest neighbor method or the bicubic method may be used.
- The synthesis processing using this bilinear method is executed for each target frame image in turn to generate one still image. In the present embodiment, even if the number of target frame images used in the synthesis is reduced as a result of excluding target frame images based on the predetermined threshold, the synthesis is executed as it is.
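The standard bilinear formula referred to above can be sketched as follows. This is a generic, minimal illustration of bilinear sampling on a grayscale image stored as a list of rows, not the patent's exact implementation:

```python
# Minimal bilinear-interpolation sketch: sample a grayscale image at
# fractional coordinates by blending the four neighbouring pixels.

def bilinear(img, x, y):
    """Sample image img (list of rows) at fractional coordinates (x, y)."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    # clamp the neighbouring samples to the image border
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    top = img[y0][x0] * (1 - dx) + img[y0][x1] * dx
    bot = img[y1][x0] * (1 - dx) + img[y1][x1] * dx
    return top * (1 - dy) + bot * dy

img = [[0.0, 10.0],
       [20.0, 30.0]]
center = bilinear(img, 0.5, 0.5)   # average of the four corners
```

Sampling at (0.5, 0.5) yields the mean of the four corner values; integer coordinates return the pixel values unchanged, which is why shifted target frames can be resampled onto the reference grid.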
- The personal computer 30 displays the number of images used for the synthesis on the display 43; this number may also be displayed prior to execution of the synthesis processing. Alternatively, a further frame image continuing the time series (a fifth image) may be extracted from the image database 20 as a target frame image and the processing from step S220 repeated, so that the number of target frame images used in the synthesis is always kept constant. A frame image preceding the reference frame image F1 in the time series may also be extracted. The personal computer 30 displays the synthesized still image at a predetermined position on the display 43, and ends this processing. The user then outputs the still image to the color printer 50 or the hard disk 34 as desired.
- In this series of image processing, target frame images that do not contribute to increasing the resolution and sharpness of the synthesized image, such as images that themselves contain blur or images that are almost identical to the reference frame image F1, are excluded in advance. Efficient image processing can therefore be performed.
- As described above, the positional shift between the reference frame image F1 and the target frame image F2 is represented by the three elements (u, v, δ).
- The reference frame image F1 has a rectangular coordinate system (x1, y1) with the origin at the center of the image, the x1 axis in the horizontal direction, and the y1 axis in the vertical direction.
- Figure 3 shows that the target frame image F2 is translated by u in the horizontal direction and by v in the vertical direction relative to the reference frame image F1, and rotated by δ about the center of the target frame image.
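The positional-shift model of Fig. 3 can be sketched as follows. This is an illustrative Python fragment under an assumed coordinate convention (rotation about the image center, angle in degrees); the function name and sign conventions are ours, not the patent's:

```python
# Sketch of the (u, v, δ) positional-shift model: a point of the
# target frame image F2, measured from the image centre, is mapped
# into the reference frame F1 by a rotation delta followed by a
# translation (u, v). Conventions are assumed for illustration.

import math

def map_to_reference(x2, y2, u, v, delta_deg):
    """Map target-frame coordinates (x2, y2) into reference-frame
    coordinates (x1, y1)."""
    d = math.radians(delta_deg)
    x1 = u + x2 * math.cos(d) - y2 * math.sin(d)
    y1 = v + x2 * math.sin(d) + y2 * math.cos(d)
    return x1, y1

# pure translation: the point simply moves by (u, v)
p = map_to_reference(10.0, 0.0, 3.0, -2.0, 0.0)
```

With δ = 0 the mapping reduces to a pure translation by (u, v); with u = v = 0 it is a pure rotation about the center, matching the three-element decomposition in the text.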
- FIG. 4 is an explanatory diagram showing a method of calculating the amount of translation error by the gradient method.
- Figures 4 (a) and 4 (c) show the brightness of the pixels on each image
- Figures 4 (b) and 4 (d) show the principle of the gradient method.
- (x1i, y1i) indicates the coordinates of one pixel on the reference frame image F1, and B1(x1i, y1i) indicates the luminance of that pixel.
- The pixel at the coordinates (x2i, y2i) on the target frame image F2 lies within the range (x1i to x1i + 1, y1i to y1i + 1) on the reference frame image F1, and its position is written (x1i + Δx, y1i + Δy).
- Assuming first that the pixel at (x2i, y2i) in the target frame image F2 is shifted only to (x1i + Δx, y1i) on the reference frame image F1, the Δx that satisfies this relation gives the translational shift of the target frame image F2 in the x-axis direction.
- Δx is calculated for each pixel, and the values are averaged.
- Likewise, assuming that the pixel at (x2i, y2i) in the target frame image F2 is at (x1i, y1i + Δy) on the reference frame image F1, the translational shift of the target frame image F2 in the y-axis direction can be determined.
- Δy is calculated for each pixel, and the values are averaged. Since equation (3) above considers only the x-axis direction and equation (6) above considers only the y-axis direction, the two are combined by extending the formulation to both the x-axis and y-axis directions.
- FIG. 5 is an explanatory diagram schematically showing the amount of rotation deviation of a pixel.
- Here, r is the distance of the pixel from the origin O of the coordinate system (x1, y1) of the reference frame image F1, and θ is the rotation angle from the x1 axis.
- FIG. 6 is an explanatory diagram of a relationship between an image shift amount and a range contributing to sharpening.
- the horizontal axis represents translational amounts u and v
- the vertical axis represents the rotational deviation δ
- the criteria that contribute to high resolution and sharpness are defined in areas (a), (b), and (c).
- The area (a) is the range of shift between adjacent frame images caused by normal camera shake or panning; the area (b) is the range of shift between adjacent frame images when fast panning or a deliberate rotation operation is performed; and the area (c) indicates the full range of translational and rotational shifts that can occur.
- The range of the region (a) is the region satisfying −16 pixels ≤ translational shift u, v ≤ +16 pixels and −1° ≤ rotational shift δ ≤ +1°. If the shift amount between frame images falls within the region (a), the frame image is judged to be a target frame image that contributes to sharpening of the image. In other words, ±16 pixels and ±1° are set as the thresholds for contributing to sharpening of the image.
- the threshold value may be a fixed value as described above, but may be set according to image conditions, synthesis conditions, and the like.
- This threshold may also be learned: the user judges the quality of the combined image, and the threshold is increased when the result is judged satisfactory for sharpening and decreased when it is not, so that the threshold for the images to be used is gradually learned from the user's judgments.
- FIG. 7 is a flowchart of image processing according to the second embodiment for generating one still image by combining a plurality of pieces of image data.
- In the second embodiment, a judgment step for stopping the image synthesizing process is provided, which differs from the first embodiment shown in FIG. 2. The processing common to the first embodiment is therefore described only briefly.
- the hardware configuration of the image processing system according to the second embodiment is the same as that of the first embodiment, and thus the same reference numerals are used and the description is omitted.
- The personal computer 30 inputs the frame image data from the image database 20 (step S400).
- As in the first embodiment, the image specified by the user is set as the reference frame image F1, and three frame images continuing from it in time series are extracted as the target frame images (F2 to F4) (step S410).
- The personal computer 30 then detects the shift amounts (u, v, δ) between the frame images (step S420).
- This shift amount detection is performed in the same manner as in the first embodiment: the shift amount (u, v, δ) is detected between adjacent frame images among the reference frame image F1 and the three target frame images (F2 to F4). As shown in FIG. 8, the shift amount detection between the reference frame image F1 and the target frame image F2 is called processing S1-2, the processing between the target frame image F2 and the target frame image F3 is called processing S2-3, and the processing between the target frame image F3 and the target frame image F4 is called processing S3-4.
- The personal computer 30 then determines whether the shift amount (u, v, δ) between one pair of adjacent frame images is within the threshold of the first embodiment (step S430). More specifically, the shift amount (u, v, δ) detected in processing S1-2 is first compared against the threshold. If the shift amount does not fall within the threshold, a warning screen is displayed on the display 43 indicating that a clear image cannot be obtained even if the image synthesis is executed (step S460), followed by a selection screen that lets the user choose whether or not to execute the synthesis processing (step S465). If the user chooses in step S465 not to execute the synthesis, the personal computer 30 stops the processing (step S470) and ends this series of image processing.
- When the shift amount is within the threshold in step S430, or when the user has chosen in step S465 to execute the synthesis even though the shift amount is out of range, it is determined whether the shift amounts (u, v, δ) of all the remaining adjacent frame-image pairs have been checked (step S440).
- If not, the process returns to step S430 to judge the shift amount between the next pair of adjacent frame images. More specifically, if the shift amount detected in processing S1-2 satisfies the threshold condition, the shift amount of processing S2-3 is judged, and if that also satisfies the condition, the shift amount of processing S3-4 is judged. If at least one pair fails to satisfy the threshold in this sequence, the warning is displayed (step S460) and the user is asked whether to execute the synthesis, as described above (step S465).
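The pair-by-pair judgment loop of steps S430 and S440 can be sketched as follows, in an illustrative Python fragment (function names are ours) that checks the adjacent-pair shifts in order and reports the first failing pair, at which point the warning of step S460 would be shown:

```python
# Illustrative sketch of the second embodiment's early-stop loop:
# shifts between adjacent frames are checked one pair at a time
# (processing S1-2, S2-3, S3-4), and checking stops at the first
# pair that exceeds the thresholds.

def within_threshold(shift, max_trans=16.0, max_rot=1.0):
    """shift is a (u, v, delta) tuple in pixels / degrees."""
    u, v, delta = shift
    return abs(u) <= max_trans and abs(v) <= max_trans and abs(delta) <= max_rot

def check_adjacent(shifts):
    """shifts[i] is the shift between frame i and frame i+1.
    Returns the index of the first failing pair, or None if all pass."""
    for i, s in enumerate(shifts):
        if not within_threshold(s):
            return i          # warn here and let the user decide
    return None

ok = check_adjacent([(1.0, 2.0, 0.1), (3.0, -1.0, 0.2)])    # all pairs pass
bad = check_adjacent([(1.0, 2.0, 0.1), (40.0, 0.0, 0.0)])   # second pair fails
```

Returning the failing index as soon as it is found mirrors the text's point that the decision to abandon the time-consuming synthesis can be made at an early stage of the processing.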
- When the personal computer 30 determines in step S440 that the shift amounts between all adjacent frame images satisfy the threshold condition, it executes the image synthesis processing (step S450).
- Before executing the synthesis, as in the first embodiment, the translational shifts u and v are judged in order to eliminate identical or nearly identical frame images.
- the personal computer 30 displays the synthesized still image on the display 43, and ends this series of image processing. This processing is the same as in the first embodiment.
- In the image processing of the second embodiment, when an image that does not contribute to the resolution and sharpness of the synthesized image is included, a warning is displayed and the user is asked to choose whether to execute the image synthesis processing.
- The user can thus easily recognize whether sharpening of the image can be expected, and can choose to run the time-consuming synthesis processing only when it can; efficient image processing is therefore possible. In general, in a moving image shot with a digital video camera, the shift amount between adjacent frame images that contribute to sharpening of the synthesized image does not change drastically. In the present embodiment, a warning is displayed if even one of the detected shifts between adjacent frame images fails the condition, so whether to stop the image synthesis can be judged at an early stage of the image processing. A more efficient image processing system can therefore be constructed.
- In the present embodiment the shift amounts of all the pairs are detected before the judgments are made, but the shift amount of one adjacent pair may instead be detected and judged at a time, with the next pair detected only when the current pair satisfies the condition. Further, in the present embodiment a warning prompting the user to cancel the synthesis is issued if even one image has a shift exceeding the threshold, but the synthesis may instead be stopped automatically when the number of usable images falls below an allowable value. For example, if one of the four extracted frame images exceeds the threshold, that image is excluded and the remaining three are synthesized; if two or more exceed the threshold, the synthesis processing is stopped.
- The hardware configuration of the image processing system of the third embodiment is the same as that of the first embodiment, except for the image data held in the image database 20 (that is, the material for the image synthesis processing). The reference numerals of the devices are therefore kept the same, the description of the hardware is omitted, and only the image data to be handled is described.
- the image data handled in the third embodiment is a plurality of still image data shot in the continuous shooting mode of a digital still camera.
- This still image data consists of JPEG image files in Exif format.
- The image database 20 contains at least four Exif files shot at 1/30-second intervals, including images whose exposure time was changed automatically according to the brightness of the subject at the time of shooting.
- Still images taken in a normal continuous shooting mode, such as four or nine frames per second, may also be used.
- FIG. 9 is a flowchart of image processing according to the third embodiment for generating one still image by combining a plurality of pieces of image data.
- the image processing installed in the personal computer 30 is started.
- the personal computer 30 reads a plurality of still image files shot in the continuous shooting mode from the image database 20 and displays them on the display 43.
- The user designates one desired still image from the displayed still images (step S500). This designation is performed by clicking with the mouse 42.
- The personal computer 30, having received the instruction specifying one still image, extracts three still images continuing in time series from the specified still image out of the read still image files (step S510).
- a process of detecting a shift amount between the four still images is performed (step S520).
- Hereinafter, the designated still image is called the reference image, and the extracted still images are called the target images.
- the personal computer 30 sets a threshold value for the amount of deviation in order to determine whether or not each image can be used for synthesizing images (step S530).
- Unlike the first embodiment, which used a fixed threshold, the threshold here is set for each target still image using the exposure time included in the shooting information of each image file. The setting of the threshold will be described later.
- the personal computer 30 determines whether or not the amount of displacement between the images is within the threshold set based on the exposure time. In the present embodiment, since the still images are shot continuously by a digital still camera, the rotational shift amount δ is small, so it suffices to determine whether the translational shift amounts u and v are within the threshold. Based on this determination, the personal computer 30 excludes any target image that does not satisfy the threshold condition from the targets of synthesis (step S540). The personal computer 30 then performs the image synthesis process using the reference image and the target images that were not excluded (step S550). This processing is the same as in the first embodiment. Finally, the personal computer 30 displays the synthesized still image on the display 43 and ends this processing.
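The exclusion of step S540 and the hand-off to the synthesis of step S550 can be sketched as follows (a minimal sketch: the list names and the per-image threshold values are illustrative assumptions, and the synthesis itself, which the text says is the same as in the first embodiment, is not shown):

```python
def exclude_targets(targets, shifts, thresholds):
    """Step S540 sketch: keep only the target images whose translational
    shift (u, v) is within that image's own threshold.  The survivors are
    then combined with the reference image in step S550.
    `targets`, `shifts`, `thresholds` are hypothetical names."""
    usable = []
    for image, (u, v), limit in zip(targets, shifts, thresholds):
        if abs(u) <= limit and abs(v) <= limit:
            usable.append(image)
    return usable

# Three target images with made-up shifts and per-image limits.
shifts = [(0.4, 0.2), (2.5, 0.1), (0.3, 0.9)]   # (u, v) per target image
thresholds = [2.0, 2.0, 0.5]                     # set from each exposure time
print(exclude_targets(["F2", "F3", "F4"], shifts, thresholds))
# prints ['F2']  (F3 fails on u, F4 fails on v)
```

Because each target carries its own threshold, a sharply exposed frame with a short exposure time can survive a larger shift than a frame exposed for the full shooting cycle.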
- FIG. 10 is an explanatory diagram of a relationship between a photographing cycle of still images photographed in the continuous shooting mode and an exposure time.
- FIG. 10 shows the exposure time Tp of still images (F1, F2, F3, ...) continuously photographed in time series at each photographing cycle Tf.
- the photographing cycle Tf is a constant interval of 1/30 second, while the exposure time Tp changes for each still image.
- the shift amount (u, v, δ) between the still image F1 and the still image F2 that are consecutive in time series is obtained. Since the images are taken in the continuous shooting mode and the rotational shift δ is small, the movement amount mf of the subject between the images can be taken as mf = √(u² + v²).
- This movement amount mf indicates the movement within the time of the shooting cycle Tf.
- The movement amount within the exposure time Tp is therefore mp = mf × (Tp / Tf). If this movement amount mp exceeds one pixel, it can be determined that blurring may have occurred in that still image. Assuming that the permissible amount of this blur is a predetermined value mpt of one pixel or less, the movement amount mf between images must satisfy mf ≤ mpt × (Tf / Tp).
- the range that satisfies this conditional expression is set as the threshold for the shift amount (u, v) between images.
- for a still image with a long exposure time Tp, the allowable range of the shift amount between images is narrow, while for a still image with a short exposure time Tp, the allowable range is widened. Therefore, an appropriate shift-amount threshold can be set for each still image even when the shooting conditions differ.
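The reasoning above reduces to a one-line threshold formula. A minimal sketch, assuming the relation mp = mf × Tp / Tf derived here (the function and parameter names are mine; only the quantities Tf, Tp, and mpt come from the description):

```python
def shift_threshold(Tf, Tp, mpt=1.0):
    """Per-image threshold on the inter-frame shift amount.

    A subject moving mf pixels per shooting cycle Tf moves
    mp = mf * Tp / Tf pixels during the exposure Tp; requiring
    mp <= mpt (the permissible blur, at most one pixel) bounds
    the inter-frame shift by mf <= mpt * Tf / Tp.
    """
    return mpt * Tf / Tp

# At a 1/30 s shooting cycle, shortening the exposure widens the allowance:
print(shift_threshold(Tf=1/30, Tp=1/30))    # prints 1.0  (long exposure)
print(shift_threshold(Tf=1/30, Tp=1/120))   # prints 4.0  (short exposure)
```

This matches the qualitative statement above: a long exposure (Tp close to Tf) leaves almost no room for inter-frame movement, while a short exposure tolerates several pixels of shift.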
- the photographing cycle Tf is constant at 1/30 second.
- an image taken from the same angle on a different day may be selected as a material to be used in the image combining process.
- images taken simultaneously by two nearby digital video cameras, digital still cameras, or the like can also be used as the material for image compositing. In this case, even though the resolution of each of the two cameras is low, a high-resolution still image can be output by the combining process.
- the embodiments of the present invention have been described above. Needless to say, the present invention is not limited to these embodiments and can be implemented in various forms without departing from the spirit of the present invention.
Industrial Applicability
- the personal computer is described as the image processing apparatus of the present invention.
- the present invention can be applied to various devices such as a printer, a digital video camera, and a digital still camera.
- the image processing apparatus and the image processing method of the present invention may be realized independently on a computer, or may be implemented in a form incorporated in these devices.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Editing Of Facsimile Originals (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04799741A EP1684506A4 (en) | 2003-11-11 | 2004-11-11 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, PROGRAM THEREFOR AND RECORDING MEDIUM |
JP2005515385A JP4325625B2 (ja) | 2003-11-11 | 2004-11-11 | 画像処理装置、画像処理方法、そのプログラムおよび記録媒体 |
US10/578,635 US7738731B2 (en) | 2003-11-11 | 2004-11-11 | Image processing device, image processing method, program thereof, and recording medium |
US12/800,608 US7961985B2 (en) | 2003-11-11 | 2010-05-18 | Image processing apparatus, image processing method, and program product thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-380614 | 2003-11-11 | ||
JP2003380614 | 2003-11-11 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/578,635 A-371-Of-International US7738731B2 (en) | 2003-11-11 | 2004-11-11 | Image processing device, image processing method, program thereof, and recording medium |
US12/800,608 Continuation US7961985B2 (en) | 2003-11-11 | 2010-05-18 | Image processing apparatus, image processing method, and program product thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005046221A1 true WO2005046221A1 (ja) | 2005-05-19 |
Family
ID=34567244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/017128 WO2005046221A1 (ja) | 2003-11-11 | 2004-11-11 | 画像処理装置、画像処理方法、そのプログラムおよび記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (2) | US7738731B2 (ja) |
EP (1) | EP1684506A4 (ja) |
JP (1) | JP4325625B2 (ja) |
CN (1) | CN100440944C (ja) |
WO (1) | WO2005046221A1 (ja) |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8498452B2 (en) | 2003-06-26 | 2013-07-30 | DigitalOptics Corporation Europe Limited | Digital image processing using face detection information |
US7440593B1 (en) | 2003-06-26 | 2008-10-21 | Fotonation Vision Limited | Method of improving orientation and color balance of digital images using face detection information |
US7574016B2 (en) | 2003-06-26 | 2009-08-11 | Fotonation Vision Limited | Digital image processing using face detection information |
US7471846B2 (en) | 2003-06-26 | 2008-12-30 | Fotonation Vision Limited | Perfecting the effect of flash within an image acquisition devices using face detection |
US7269292B2 (en) | 2003-06-26 | 2007-09-11 | Fotonation Vision Limited | Digital image adjustable compression and resolution using face detection information |
US7792970B2 (en) | 2005-06-17 | 2010-09-07 | Fotonation Vision Limited | Method for establishing a paired connection between media devices |
US8989453B2 (en) | 2003-06-26 | 2015-03-24 | Fotonation Limited | Digital image processing using face detection information |
US7844076B2 (en) | 2003-06-26 | 2010-11-30 | Fotonation Vision Limited | Digital image processing using face detection and skin tone information |
US8494286B2 (en) | 2008-02-05 | 2013-07-23 | DigitalOptics Corporation Europe Limited | Face detection in mid-shot digital images |
US8948468B2 (en) | 2003-06-26 | 2015-02-03 | Fotonation Limited | Modification of viewing parameters for digital images using face detection information |
US7565030B2 (en) | 2003-06-26 | 2009-07-21 | Fotonation Vision Limited | Detecting orientation of digital images using face detection information |
WO2005046221A1 (ja) * | 2003-11-11 | 2005-05-19 | Seiko Epson Corporation | 画像処理装置、画像処理方法、そのプログラムおよび記録媒体 |
US8503800B2 (en) | 2007-03-05 | 2013-08-06 | DigitalOptics Corporation Europe Limited | Illumination detection using classifier chains |
JP4513764B2 (ja) * | 2006-02-20 | 2010-07-28 | セイコーエプソン株式会社 | 画像判定装置および画像判定方法 |
WO2007142109A1 (ja) * | 2006-05-31 | 2007-12-13 | Nec Corporation | 画像高解像度化装置及び画像高解像度化方法並びにプログラム |
ATE497218T1 (de) | 2006-06-12 | 2011-02-15 | Tessera Tech Ireland Ltd | Fortschritte bei der erweiterung der aam- techniken aus grauskalen- zu farbbildern |
JP4225339B2 (ja) * | 2006-09-11 | 2009-02-18 | ソニー株式会社 | 画像データ処理装置および方法、プログラム、並びに記録媒体 |
US8055067B2 (en) | 2007-01-18 | 2011-11-08 | DigitalOptics Corporation Europe Limited | Color segmentation |
DE602008001607D1 (de) | 2007-02-28 | 2010-08-05 | Fotonation Vision Ltd | Trennung der direktionalen beleuchtungsvariabilität in der statistischen gesichtsmodellierung auf basis von texturraumzerlegungen |
JP4970557B2 (ja) | 2007-03-05 | 2012-07-11 | デジタルオプティックス・コーポレイション・ヨーロッパ・リミテッド | デジタル画像取込装置における顔検索及び検出 |
US7916971B2 (en) | 2007-05-24 | 2011-03-29 | Tessera Technologies Ireland Limited | Image processing method and apparatus |
WO2008150285A1 (en) * | 2007-05-24 | 2008-12-11 | Fotonation Vision Limited | Image processing method and apparatus |
US8068700B2 (en) * | 2007-05-28 | 2011-11-29 | Sanyo Electric Co., Ltd. | Image processing apparatus, image processing method, and electronic appliance |
US20090161982A1 (en) * | 2007-12-19 | 2009-06-25 | Nokia Corporation | Restoring images |
US8750578B2 (en) | 2008-01-29 | 2014-06-10 | DigitalOptics Corporation Europe Limited | Detecting facial expressions in digital images |
WO2009098749A1 (ja) * | 2008-02-04 | 2009-08-13 | Panasonic Corporation | 画像合成装置、画像合成方法、画像合成プログラム及び集積回路並びに撮像システム及び撮像方法 |
US7855737B2 (en) | 2008-03-26 | 2010-12-21 | Fotonation Ireland Limited | Method of making a digital camera image of a scene including the camera user |
CN106919911A (zh) | 2008-07-30 | 2017-07-04 | 快图有限公司 | 使用脸部检测的自动脸部和皮肤修饰 |
JP5230381B2 (ja) * | 2008-11-28 | 2013-07-10 | 三星電子株式会社 | 撮像装置及び撮像装置の制御方法 |
JP4735742B2 (ja) * | 2008-12-18 | 2011-07-27 | カシオ計算機株式会社 | 撮像装置、ストロボ画像生成方法、および、プログラム |
WO2010095460A1 (ja) * | 2009-02-19 | 2010-08-26 | 日本電気株式会社 | 画像処理システム、画像処理方法および画像処理プログラム |
US20100304720A1 (en) * | 2009-05-27 | 2010-12-02 | Nokia Corporation | Method and apparatus for guiding media capture |
US8520967B2 (en) * | 2009-06-26 | 2013-08-27 | Nokia Corporation | Methods and apparatuses for facilitating generation images and editing of multiframe images |
US8379917B2 (en) | 2009-10-02 | 2013-02-19 | DigitalOptics Corporation Europe Limited | Face recognition performance using additional image features |
JP2015039049A (ja) * | 2009-10-26 | 2015-02-26 | 株式会社東芝 | 画像処理装置及び画像処理方法 |
US9509911B2 (en) * | 2010-07-14 | 2016-11-29 | Nikon Corporation | Image-capturing device, and image combination program |
JP5653184B2 (ja) * | 2010-11-11 | 2015-01-14 | 三菱電機株式会社 | 画像処理装置及び方法 |
CN102572250A (zh) * | 2010-12-27 | 2012-07-11 | 华晶科技股份有限公司 | 电子装置、影像拍摄装置及其方法 |
US8379999B2 (en) * | 2011-01-18 | 2013-02-19 | Chanan Gabay | Methods, circuits, devices, apparatuses and systems for providing image composition rules, analysis and improvement |
CN103259972A (zh) * | 2012-02-17 | 2013-08-21 | 佳能企业股份有限公司 | 影像处理方法及成像装置 |
CN103685951A (zh) * | 2013-12-06 | 2014-03-26 | 华为终端有限公司 | 一种图像处理方法、装置及终端 |
JP2015177221A (ja) | 2014-03-13 | 2015-10-05 | オリンパス株式会社 | 撮像装置、撮像方法、データ記録装置、及びプログラム |
EP2937835A1 (en) * | 2014-04-22 | 2015-10-28 | Ceva D.S.P. Ltd. | System and method for generating a super-resolution image |
CN104301484B (zh) * | 2014-10-24 | 2017-08-25 | 天津市康凯特软件科技有限公司 | 展示手机程序变化过程的方法 |
CN104580905A (zh) * | 2014-12-31 | 2015-04-29 | 广东欧珀移动通信有限公司 | 一种拍照方法及终端 |
JP5846549B1 (ja) * | 2015-02-06 | 2016-01-20 | 株式会社リコー | 画像処理システム、画像処理方法、プログラム、撮像システム、画像生成装置、画像生成方法およびプログラム |
US9979890B2 (en) | 2015-04-23 | 2018-05-22 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10007990B2 (en) * | 2015-12-24 | 2018-06-26 | Intel Corporation | Generating composite images using estimated blur kernel size |
US10009536B2 (en) | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
US10905402B2 (en) * | 2016-07-27 | 2021-02-02 | Canon Medical Systems Corporation | Diagnostic guidance systems and methods |
US11095816B2 (en) * | 2016-12-02 | 2021-08-17 | Sony Semiconductor Solutions Corporation | Image pickup element, image pickup method, and electronic device for image stabilization |
CN107230192B (zh) | 2017-05-31 | 2020-07-21 | Oppo广东移动通信有限公司 | 图像处理方法、装置、计算机可读存储介质和移动终端 |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
JP6960997B2 (ja) * | 2017-08-09 | 2021-11-05 | 富士フイルム株式会社 | 画像処理システム、サーバ装置、画像処理方法、及び画像処理プログラム |
WO2019127512A1 (zh) * | 2017-12-29 | 2019-07-04 | 深圳市大疆创新科技有限公司 | 拍摄设备的图像处理方法、拍摄设备及可移动平台 |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | USER INTERFACES FOR SIMULATED DEPTH EFFECTS |
US10674072B1 (en) | 2019-05-06 | 2020-06-02 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US10672101B1 (en) * | 2019-03-04 | 2020-06-02 | Omnivision Technologies, Inc. | DRAM with simultaneous read and write for multiwafer image sensors |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09261526A (ja) * | 1996-03-19 | 1997-10-03 | Olympus Optical Co Ltd | 撮像装置 |
JPH11187307A (ja) * | 1997-12-17 | 1999-07-09 | Canon Inc | 撮像装置及び撮像方法 |
JP2000152250A (ja) * | 1998-11-10 | 2000-05-30 | Canon Inc | 画像処理装置、方法及びコンピュータ読み取り可能な記憶媒体 |
JP2002112095A (ja) * | 2000-09-29 | 2002-04-12 | Minolta Co Ltd | デジタルスチルカメラ |
JP2004229004A (ja) * | 2003-01-23 | 2004-08-12 | Seiko Epson Corp | 画像生成装置、画像生成方法および画像生成プログラム |
JP2004234624A (ja) * | 2003-01-07 | 2004-08-19 | Seiko Epson Corp | 静止画像生成装置、静止画像生成方法、静止画像生成プログラム、および静止画像生成プログラムを記録した記録媒体 |
JP2004272751A (ja) * | 2003-03-11 | 2004-09-30 | Seiko Epson Corp | 複数のフレーム画像からの静止画像の生成 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5649032A (en) * | 1994-11-14 | 1997-07-15 | David Sarnoff Research Center, Inc. | System for automatically aligning images to form a mosaic image |
US5696848A (en) * | 1995-03-09 | 1997-12-09 | Eastman Kodak Company | System for creating a high resolution image from a sequence of lower resolution motion images |
US6075905A (en) * | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
US6552744B2 (en) * | 1997-09-26 | 2003-04-22 | Roxio, Inc. | Virtual reality camera |
US6650361B1 (en) * | 1997-12-17 | 2003-11-18 | Canon Kabushiki Kaisha | Imaging apparatus control method, and a computer program product having computer program code therefor |
JP4371457B2 (ja) | 1999-02-18 | 2009-11-25 | キヤノン株式会社 | 画像処理装置、方法及びコンピュータ読み取り可能な記憶媒体 |
US6804419B1 (en) * | 1998-11-10 | 2004-10-12 | Canon Kabushiki Kaisha | Image processing method and apparatus |
AU2710201A (en) * | 2000-01-24 | 2001-07-31 | Matsushita Electric Industrial Co., Ltd. | Image composting apparatus, recording medium and program |
US6834128B1 (en) * | 2000-06-16 | 2004-12-21 | Hewlett-Packard Development Company, L.P. | Image mosaicing system and method adapted to mass-market hand-held digital cameras |
US6799920B2 (en) * | 2000-09-01 | 2004-10-05 | Precision Cover Systems, Inc. | Angle adjustable utility access and method |
JP2002077713A (ja) * | 2000-09-05 | 2002-03-15 | Minolta Co Ltd | 撮像装置 |
JP2003153080A (ja) * | 2001-11-09 | 2003-05-23 | Matsushita Electric Ind Co Ltd | 映像合成装置 |
NL1019365C2 (nl) * | 2001-11-14 | 2003-05-15 | Tno | Bepaling van een beweging van een achtergrond in een reeks beelden. |
JP4024581B2 (ja) * | 2002-04-18 | 2007-12-19 | オリンパス株式会社 | 撮像装置 |
US20040225221A1 (en) * | 2003-05-06 | 2004-11-11 | Olsson Lars Jonas | Diagnostic ultrasound imaging system with adaptive persistence |
US7474767B2 (en) * | 2003-09-24 | 2009-01-06 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Motion detection using multi-resolution image processing |
JP4461937B2 (ja) * | 2003-09-30 | 2010-05-12 | セイコーエプソン株式会社 | 低解像度の複数の画像に基づく高解像度の画像の生成 |
WO2005046221A1 (ja) * | 2003-11-11 | 2005-05-19 | Seiko Epson Corporation | 画像処理装置、画像処理方法、そのプログラムおよび記録媒体 |
US20050175235A1 (en) * | 2004-02-05 | 2005-08-11 | Trw Automotive U.S. Llc | Method and apparatus for selectively extracting training data for a pattern recognition classifier using grid generation |
JP4476723B2 (ja) * | 2004-07-14 | 2010-06-09 | アルパイン株式会社 | 画像表示装置 |
US7460730B2 (en) * | 2005-08-04 | 2008-12-02 | Microsoft Corporation | Video registration and image sequence stitching |
- 2004
- 2004-11-11 WO PCT/JP2004/017128 patent/WO2005046221A1/ja active Application Filing
- 2004-11-11 EP EP04799741A patent/EP1684506A4/en not_active Withdrawn
- 2004-11-11 CN CNB2004800330129A patent/CN100440944C/zh not_active Expired - Fee Related
- 2004-11-11 JP JP2005515385A patent/JP4325625B2/ja not_active Expired - Fee Related
- 2004-11-11 US US10/578,635 patent/US7738731B2/en not_active Expired - Fee Related
- 2010
- 2010-05-18 US US12/800,608 patent/US7961985B2/en not_active Expired - Fee Related
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007010807A (ja) * | 2005-06-28 | 2007-01-18 | Fuji Xerox Co Ltd | 情報処理システム、情報処理装置、情報処理方法及びコンピュータ・プログラム |
JP2007036359A (ja) * | 2005-07-22 | 2007-02-08 | Casio Comput Co Ltd | 画像合成装置、画像合成方法及びプログラム |
JP2007151080A (ja) * | 2005-10-27 | 2007-06-14 | Canon Inc | 画像処理装置及び画像処理方法 |
US8050518B2 (en) | 2005-10-27 | 2011-11-01 | Canon Kabushiki Kaisha | Image processing apparatus and method for readily identifying image data suitable for super-resolution processing |
JP2007305113A (ja) * | 2006-04-11 | 2007-11-22 | Matsushita Electric Ind Co Ltd | 画像処理方法および画像処理装置 |
JP2008033914A (ja) * | 2006-06-28 | 2008-02-14 | Matsushita Electric Ind Co Ltd | 画像読出し方法および画像拡大方法 |
JP2009171559A (ja) * | 2007-12-21 | 2009-07-30 | Canon Inc | 画像処理装置及び画像処理方法 |
JP2009171560A (ja) * | 2007-12-21 | 2009-07-30 | Canon Inc | 画像処理装置及び画像処理方法並びに画像処理方法を実行するプログラム及び記憶媒体 |
WO2009082015A1 (en) * | 2007-12-21 | 2009-07-02 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and program and memory medium for executing image processing method |
JP2010103876A (ja) * | 2008-10-27 | 2010-05-06 | Sony Corp | 画像処理装置、画像処理方法およびプログラム |
JP4623199B2 (ja) * | 2008-10-27 | 2011-02-02 | ソニー株式会社 | 画像処理装置、画像処理方法およびプログラム |
US8368815B2 (en) | 2008-10-27 | 2013-02-05 | Sony Corporation | Image processing apparatus, image processing method, and program |
US9153289B2 (en) | 2008-10-27 | 2015-10-06 | Sony Corporation | Image processing apparatus, image processing method, and program |
JP2011244069A (ja) * | 2010-05-14 | 2011-12-01 | Panasonic Corp | 撮像装置、集積回路および画像処理方法 |
JP2015146192A (ja) * | 2012-10-29 | 2015-08-13 | 株式会社日立国際電気 | 画像処理装置 |
Also Published As
Publication number | Publication date |
---|---|
US20100232703A1 (en) | 2010-09-16 |
CN1879401A (zh) | 2006-12-13 |
US7738731B2 (en) | 2010-06-15 |
US20070133901A1 (en) | 2007-06-14 |
JPWO2005046221A1 (ja) | 2007-05-24 |
US7961985B2 (en) | 2011-06-14 |
EP1684506A1 (en) | 2006-07-26 |
CN100440944C (zh) | 2008-12-03 |
EP1684506A4 (en) | 2008-06-04 |
JP4325625B2 (ja) | 2009-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005046221A1 (ja) | 画像処理装置、画像処理方法、そのプログラムおよび記録媒体 | |
JP4120677B2 (ja) | 複数のフレーム画像からの静止画像の生成 | |
US7317558B2 (en) | System and method for image processing of multiple images | |
US7729563B2 (en) | Method and device for video image processing, calculating the similarity between video frames, and acquiring a synthesized frame by synthesizing a plurality of contiguous sampled frames | |
US7535497B2 (en) | Generation of static image data from multiple image data | |
JP2009253506A (ja) | 画像処理装置、画像処理方法、手振れ範囲推定装置、手振れ範囲推定方法、及びプログラム | |
JP6656035B2 (ja) | 画像処理装置、撮像装置および画像処理装置の制御方法 | |
JP5825256B2 (ja) | 画像処理装置、画像処理方法および画像処理用プログラム | |
JP2006092450A (ja) | 画像処置装置および画像処理方法 | |
JP2008198082A (ja) | 画像処理方法、画像処理装置、及びディジタルカメラ | |
JP4340836B2 (ja) | 画像合成装置及び画像合成プログラム | |
JP2006350936A (ja) | 画像作成装置、及び画像作成用プログラム | |
JP7118729B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP2010026734A (ja) | 画像処理装置およびその方法 | |
JPH10108003A (ja) | 画像合成装置および画像合成方法 | |
JP2011188035A (ja) | 撮像装置及びパノラマ画像合成方法並びにそのプログラム | |
JP2005094614A (ja) | 手ぶれ補正装置、手ぶれ補正方法および手ぶれ補正プログラムを記録した記録媒体 | |
JP2008310418A (ja) | 画像処理装置、画像処理プログラム及びそれらを搭載した電子カメラ | |
JP2006033232A (ja) | 画像処理装置 | |
JP2006119730A (ja) | 画像のつなぎ合わせ | |
JP2005252739A (ja) | 静止画像生成装置およびその方法 | |
JP2005277916A (ja) | 動画像処理装置、画像処理システム、動画像処理方法およびそのプログラム、記録媒体 | |
JPH0991407A (ja) | パノラマ画像合成システム及びパノラマ画像合成方法 | |
JP2009042900A (ja) | 撮像装置および画像選択装置 | |
JP3325823B2 (ja) | 映像静止画表示方法及び装置並びに映像静止画表示プログラム格納記録媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480033012.9 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005515385 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2007133901 Country of ref document: US Ref document number: 10578635 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004799741 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2004799741 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10578635 Country of ref document: US |